Embarcados interview: Clive "Max" Maxfield
Clive "Max" Maxfield received his B.Sc. in Control Engineering in 1980 from Sheffield Hallam University, Sheffield, England.
He began his career as a designer of central processing units (CPUs) for mainframe computers.
Over the years, Max has designed everything from silicon chips to circuit boards, and from brainwave amplifiers to Steampunk "Display-O-Meters". He has also been at the forefront of Electronic Design Automation (EDA) for more than 20 years.
Max's numerous technical articles have appeared in a wide variety of magazines, including ED, EDN, Chip Design, EE Times, PCB Design, and the electronics and computing hobbyist magazine Everyday Practical Electronics (EPE). Also, he has held contributing editor or executive editor roles at Programmable Logic DesignLine, Chip Design Magazine, SOC Central, and Everyday Practical Electronics. Max has presented papers at technical conferences around the world; e.g., in 2010 he presented at the Embedded Systems Conference (ESC) Silicon Valley in April, ESC India in July, and the Embedded Live Conference in London, England in October; in 2012 he gave the keynote presentation at the FPGA Forum in Norway and also a guest lecture at Oslo University in Norway.
Max is the author and/or co-author of a number of books, including Designus Maximus Unleashed (Banned in Alabama), Bebop To The Boolean Boogie (An Unconventional Guide to Electronics), Bebop BYTES Back (An Unconventional Guide to Computers), EDA: Where Electronics Begins, 3D Graphics on Windows NT, The Design Warrior's Guide to FPGAs, FPGAs: Instant Access, and How Computers Do Math.
Max was interviewed by André Castelan Prado on the 15th of May. Check out what Max has to say about embedded systems and FPGAs! Also check out his awesome office.
Max received us, virtually, at his office. This is the entrance to "Max's World, where the colors are brighter, the butterflies are bigger, the birds sing sweeter, and the beer is plentiful and cold."
André: Is this office the source of your creativity? You’ve been doing some fun stuff lately!
Max: My office is the place where I keep things that my wife doesn't let me play with at home (laughs). I wish you could be here (Max paused the interview and showed us things like tin cans made of ceramic, which he made himself, and other creations all over the shelves).
I used to be an independent consultant; then I took a job at EETimes as the editor of Programmable Logic DesignLine. EETimes has big offices in New York and San Francisco. Theoretically I work from my home, but I prefer to have a real office. First of all, it makes you feel like you have a real job (laughs). In the morning I get in my car and I drive to the office, which is like a little sanctuary. When I first started to work, "back in the day," you had to wear suits and shoes and shirts and ties. When I went independent, I got tired of wearing suits; more recently, about a year ago, I went to my wardrobe and threw everything out.
I gave it to the charity shops and tried to simplify things a lot. Now I think I have three pairs of shorts, three pairs of jeans, a lot of Hawaiian shirts, a few T-shirts and that’s it. These kinds of clothes make me feel happy and I wear them all the time, including when I'm giving presentations at conferences.
André: What about the AllProgrammablePlanet migration to EETimes?
Max: It was a bit awkward, to be honest. When I was asked to set up AllProgrammablePlanet.com, the whole mission was to create a community. It was sponsored by Xilinx, but it wasn't all about Xilinx -- instead it was about having an all programmable planet -- everything to do with FPGAs.
We did a pretty good job and we ran for about 18 months. We really had a great community going, and a lot of people made good friendships there. A lot of people from AllProgrammablePlanet came over to EETimes -- people like Duane Benson, Adam Taylor, Crusty, and a bunch of others. We still keep in touch and maintain our friendship, and I think that this community spirit is carrying on at EETimes.
André: You cover Programmable logic, Microcontrollers, PCB, Prototyping... how can you handle all that? And also, why digital systems and programmable logic?
Max: Well, it helps that I've been around for a long time. I predate EDA (Electronic Design Automation) as we know it. I graduated from university in 1980. My first job was as a member of a team designing CPUs for mainframe computers.
You've got to remember that programmable logic wasn't around at that time. All of this predates what we now refer to as digital signal processing (DSP); you really didn't do stuff like this in real time in digital computers because we simply didn't have that amount of computing power.
Also there was a big split between analog and digital. When I was at university, we had both analog and digital computers. We could do a lot of things in analog. Some people decided to focus on analog, but I personally found digital was very logical and it suited me well.
A lot of computing was still very new at that time, and we could really invent things. I learned a huge amount there. We were pretty much the state-of-the-art at that time. For example, I designed my first ASIC in 1980. These devices contained only around 2,000 logic gates, which doesn’t seem like much, but we were designing them as gate-level schematics using pencil and paper. There were no logic synthesis tools and no high level languages -- it wasn't until around 1990 that commercial logic synthesis came along and the whole design paradigm changed.
In the early days of my career we had simple logic devices like PLDs and a few CPLDs, but no FPGAs. It wasn't until 1985 that FPGAs started to appear on the scene, and they were very simple. As designers, we really didn't see too much potential for them; we used them to implement a few simple tasks like lookup tables (LUTs), glue logic, and rudimentary state machines. Remember that, until that time, we only had microcontrollers and simple TTL chips available to us, along with really simple PLDs. So when FPGAs came along we largely saw them as an easy way to gather multiple logic gates into a single chip.
André: How did FPGAs get started?
Max: When Xilinx released their first FPGA in 1985, it was a relatively simple device. It was an 8 x 8 matrix of programmable logic cells, where each cell had one 4-input LUT, a register, and a multiplexer. There wasn't really any logic synthesis at that time -- you programmed each LUT by hand -- it was very different to today.
At that time, we never had any conception of the sort of devices we have today, with multiple billions of transistors and everything else. The first FPGAs were very simple fabric with just an array of simple programmable elements; then they started adding things like blocks of memory, DSP slices, PLLs, etc.
Both Altera and Xilinx experimented with having hardcore processors -- Altera had ARM stripes and Xilinx had the PowerPC. Unfortunately, at that time no one was really interested and no one really cared. Altera and Xilinx spent a lot of money there and really didn’t make much of it back.
Those days were sort of like the way things are now in Brazil. Most people at that time didn’t really understand the potential of FPGAs. Also, the FPGAs of that time didn’t offer tremendously high performance, and designers could obtain high performance with standalone microcontrollers.
Another consideration is that there are a lot of people who have grown up programming in software. In the case of embedded systems, the vast majority of people are used to programming with microcontrollers. It’s relatively easy when you think about it -- you can buy a microcontroller for a couple of dollars, put it on a board with a bit of memory, and then you can write your program. If you make a mistake, you can easily change the C code.
What people tend to forget is that microcontrollers are extremely good for making decision-based logic: "If this is true and that is true, then do this..." Microcontrollers are great for stuff like this. However, they are not so good when it comes to doing algorithmic data processing -- that is, when you have lots of data and you are performing some algorithm to process the data over and over again.
When I say microcontrollers I mean microprocessors also -- both of these are just about the most inefficient way of performing these calculations known to humankind, because they do things sequentially, instruction by instruction, which is very slow and very inefficient. The only reason that everything looks so efficient and impressive these days is because we are talking about running our processors with clock frequencies around 2.4 gigahertz, which is pretty much the same frequency we use in a microwave oven. Scary! The end result is that we've got processor chips that consume a humongous amount of power.
When it comes to embedded systems, the fact that microcontrollers aren't tremendously efficient with regard to processing data wasn't as important in the not-so-distant past. This was because the majority of embedded systems were things like thermostats, washing machines, automotive lights, and so forth, with not much data processing going on. The thing is that when we think about modern embedded systems, we are thinking about the Internet of Things (IoT). In this case, we can be performing a lot of data processing. The point is that digital data processing algorithms are inherently parallel in nature, and we can use FPGAs to perform operations in a massively parallel fashion.
High-end digital signal processor chips that you program in C are really fast, but they also cost a lot of money. Also, they simply can’t compete against an FPGA running at a much lower clock frequency, because the FPGA can process a huge amount of data in parallel.
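Max's point about parallelism can be sketched in a few lines of Python (a hypothetical 8-tap filter with made-up numbers, purely to illustrate the operation counts he is describing):

```python
# Sketch: why data-processing algorithms favor parallel hardware.
# An N-tap FIR filter needs N multiply-accumulates (MACs) per output sample.
taps = [1, 2, 3, 4, 4, 3, 2, 1]          # hypothetical 8-tap filter
samples = list(range(100))               # 100 hypothetical input samples

# A processor performs the MACs one instruction at a time:
sequential_ops = len(samples) * len(taps)   # 800 sequential operations

# An FPGA can instantiate all 8 multipliers side by side, so each
# output sample costs roughly one clock cycle:
parallel_cycles = len(samples)              # ~100 cycles

print(sequential_ops, parallel_cycles)      # 800 vs 100
```

This is why an FPGA clocked far below 2.4 GHz can still out-process a fast sequential CPU: the work per clock cycle scales with the number of parallel hardware units, not the clock rate alone.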
André: And why do you think FPGAs don’t get much love from the developers?
Max: There are a number of reasons. One big reason is that everybody prefers to use what they are familiar with -- people don't like to change, and some people are scared of change. They are still selling around a billion 8-bit 8051-based devices a year, because lots of people are still comfortable with this architecture. Also, a lot of these people are still programming in assembly language because it's what they have been doing for the last twenty years -- they don't know C and they don't want to bridge that gap.
When I started out, I met people who used to design with vacuum tubes, and they had found it very hard to switch to transistors. I also met people who had learned to design using individual transistors, capacitors, and resistors, and they could not switch their minds over to using TTL logic -- they simply couldn't grasp the concept. I know it sounds weird, but they couldn't.
So when we come to today, one problem is that the people who are used to writing things in a programming language like C think of things as being very sequential -- even with things like multiple processor cores and multiple threads, people still think about software in a sequential way. By comparison, the hardware design engineers that design FPGAs think about things happening concurrently -- they are familiar with hardware in which everything can be happening at once. The languages they use, like VHDL and Verilog, are aimed towards this type of hardware design.
I've heard quite a few people say that if you are a software guy and you are going to learn hardware design, learn VHDL, not Verilog. This is because VHDL is very different to C, while Verilog is very similar to C. So if you have software developers trying to work with Verilog, you end up with them trying to write software instead of hardware, and this doesn't end well.
André: What do you think about high level synthesis tools? Do you think the same thing is going to happen that happened to C and Assembly?
Max: These tools are coming along incredibly well, so I think we will ultimately see something similar to what happened with C and assembly language. I remember when people programmed in assembly and then C arrived on the scene along with C compilers. The people who coded in assembly language said: "Look, I can create better code in assembly than your compiler, because I can optimize everything." Well, that is probably true if you have a small project, but if you are trying to write something like Microsoft Word in assembly, it's not going to happen in a lifetime.
Also, the C compilers got better and better, and now the code that comes out of a C compiler is very efficient. In addition, the people that write in C can explore different design scenarios pretty easily, much more easily than they can in assembly.
The same thing applies to hardware description languages (HDLs) like Verilog and VHDL. When HDLs and logic synthesis came along, I remember all the hardware designers saying that it wouldn’t catch on -- that they could do better designs by hand with paper and pencil than with a hardware synthesis engine. And, once again, that might have been true in the beginning. But even then, you could change one line of HDL code and obtain a different result. This is the way to go -- the ability to quickly and easily explore different design scenarios. And the synthesis engine got better and better with decisions and optimizations, and now it’s very hard for a designer to beat a synthesis engine.
With high-level synthesis (HLS), the same sort of thing applies. The first kind of high-level synthesis (which was maybe 10 or 12 years ago) was called "behavioral synthesis." The results were pretty bad and people really didn't like it (that's why they don't call it behavioral synthesis anymore LOL). But today's HLS is coming along quite well.
For example, Xilinx introduced something recently which is a domain specific high level synthesis for networks. This is where you specify at a high level abstraction what you want to do with your data processing plane. The top-level of the design is performed by network experts who specify things in the same way they specify a network. They define things at a very high level abstraction and then press a "Go" button. The HLS tool takes this description and generates the RTL; in turn, this RTL feeds into the Vivado suite of tools that ultimately programs the FPGA.
André: Do today's RTL Engineers need to adapt or die?
Max: It’s not so much an "adapt or die" sort of thing. On the one hand things are changing very quickly; on the other hand things do have a way of persisting.
When it comes to today's FPGA designs, we typically think about those designs being captured in RTL and run through a synthesis tool. However, there are a lot of people in Asia and India who are still designing FPGAs using schematic tools because that’s the way they know how to do it.
So I think what’s going to happen is that those RTL designers that don’t want to go to the next level will continue to do things the way they are used to and eventually they will retire. Of course it may be that the company will say “Look we need someone who can do this, can you?” and if they say "No" then they will be retired forcibly. But I think a more common scenario is that existing designers will be so good at what they do that they cannot easily be replaced. On the other hand, the next generation of designers will use the latest and greatest tools and they will push the technology further.
Having said this, high-level synthesis is a reality. It's not yet used by the majority of designers, but there are people out there doing different things with it.
André: What about RTL Design in Python with MyHDL?
Max: I believe that MyHDL is very good, although I've never used it myself. I am trying to learn Python, but I never seem to have time for anything. I've heard tremendous things about MyHDL. Its inventor, Jan Decaluwe, is a very, very clever guy -- very knowledgeable -- and he created MyHDL to overcome the problems with other HDLs. You can code things up in other HDLs that you think are going to be synthesized one way, and they end up being synthesized another way. With MyHDL you can't do that -- when you code it, you know just what's going to happen.
Also, because it’s in Python, you can run it immediately and verify it, then when you are ready to go you press the "Go" button and it generates VHDL or Verilog -- whichever you want.
If I met someone who was starting out as a new designer and had to do a new project from scratch, I would say to look carefully at MyHDL, because it might be the way to go. That way, if you have a customer who wants VHDL, for example, you just press the VHDL button and then you have the VHDL. If your next customer wants Verilog, you can take your original MyHDL design and use it to generate Verilog.
André: It's very hard to convince older RTL engineers to make the switch to MyHDL or high-level synthesis; they still think you can't get logic that's compact enough or designs that are fast enough.
Max: With MyHDL you can create something that's very compact and expand it to VHDL; the proof of the pudding is in the tasting, as they say. Jan is great at that -- he will say "give me a problem," then he will create an example in MyHDL which is very compact and very small -- and it works. He will simulate it in Python and show that it works, then he generates the VHDL or Verilog from it.
When younger engineers show they can do things quicker and better using new tools and techniques than older engineers can, this will prompt the older engineers to make the switch.
André: Maybe that can lower the barrier to entry into the FPGA field for the software guys, right?
Max: Well, again, it's going to be very difficult to persuade a software developer to learn how to write HDL, even with tools like Catapult C, which used to be owned by Mentor until they sold it. You can capture your design in C or C++ and verify it at that level, then you can use Catapult C to identify things like loops and areas in which you can perform things like resource sharing, etc. For each decision you make, Catapult C will estimate things like latency, time-of-flight, and resource utilization, which makes it really easy to explore different design scenarios. When you are ready, you press the "Go" button and Catapult C generates the equivalent RTL.
The problem is that all of this is still something that software designers cannot easily do. To a large extent you still have to write your C/C++ thinking about how the hardware is to be implemented, and to have some sort of idea as to where things are going. So I think that currently there is still going to be a hardware/software divide.
Now you have Altera and Xilinx with SoC-type FPGAs that have multiple hard ARM cores combined with programmable fabric. When you have one of these devices on your board, the software guys can start programming pretty much immediately. They can write their code and profile it and say "I need to accelerate that function," and then the hardware guys can take that C/C++ code, use high-level synthesis to convert it to RTL, and there you are.
The thing to be careful of is that it's not going to be a "one size fits all" situation. There are some cases where you look at a design and say "All I want to do is to detect when a door opens and, when it does, turn this light on. I can buy a cheap-and-cheerful microcontroller for a dollar and program it really quickly, so for this project a microcontroller is the obvious solution."
At the other end of the spectrum you could build a custom ASIC/SoC, which will give you tremendous capacity and performance, but then all of your algorithms are "frozen in silicon," and if you mess something up it's going to be very expensive to go and fix it. The big advantage of an FPGA is that you can reprogram it. In some cases you might say "The best solution for this design is to perform the decision-making functions with a microcontroller and the data processing functions in an FPGA." However, it may be that your bottleneck now becomes the communication between the two devices. If you can put the two devices -- the microcontroller and the programmable FPGA fabric -- into the same package, now you can communicate at silicon speeds again. Which brings us back to the SoC FPGAs with dual-core ARM Cortex-A9 processors (more powerful processors will be available in future generations). These devices mean you can get huge amounts of throughput.
André: We've seen the maker movement lately, with the Arduino, BeagleBone Black, Raspberry Pi, and similar boards getting a lot of people interested in the embedded systems field. Do we have an FPGA contender?
Max: When I was young, lots of people were interested in electronics. A lot of the people I knew had a soldering iron and used to play with stuff. Over time this kind of thing seemed to be dying out. And then, more recently, the Arduino arrived on the scene and suddenly lots of people are making things again, and this is very good.
There's the Papilio board, which is FPGA-based. When you power it up, the default configuration makes it look like an Arduino, so you can load your C program directly, just like an Arduino. Or you can reprogram the FPGA on the Papilio and use it like an FPGA.
Then there's the Arduissimo, which is a board that looks like an Arduino Mega, but it contains an FPGA that is programmed to look like 16 Arduino cores. That's pretty interesting. There are two ways of doing this -- one way is to actually replicate 16 cores, but that's not what its creator -- Tobias Strauch -- has done. Instead, Tobias has done something called System Hyper Pipelining (SHP), where each functional block of an Arduino is implemented once, but then he multiplexes all the inputs and outputs so that it appears as though you get 16 cores. That way he can put up to 16 Arduinos in an FPGA that should really only handle a couple.
The big thing about an Arduino is that when you first power it up the LED starts blinking -- you are already seeing something happening. You can get the C code for the default blink program, make a small change, upload your new program to the Arduino, and see it all working immediately. You can be up and running in just a few minutes -- everything is incredibly easy.
Now try doing this with an FPGA. Just downloading the vendor's software can take a long time. Then you need to license it, then you need to do 'this,' and then you need to do 'that.' This can bring even strong people to their knees. This is not to say that professional engineers find FPGAs to be too hard to use; it's just that -- generally speaking -- they are not ready for the hobbyist market. Apart from anything else, if you are a beginner working with a microcontroller like an Arduino, then there are a lot of people out there who can help you and teach you. It's very hard to find someone to teach you how to use an FPGA.
Of course things are different for professional designers creating real-world systems. It’s becoming incredibly expensive to create new devices from the ground up. Designing an SoC at say the 20 nanometer technology node costs a humongous amount of money -- you have to be selling millions of units to make this investment worthwhile.
With an FPGA you can make a 20 nm part and sell it to a lot of customers, because they can reprogram it to do whatever they want to do. Also we have to remember that FPGAs now cover the entire spectrum. There are high-end devices that offer tremendous capacity and performance, but they cost a lot of money and consume a lot of power. Then there are ultra-low density (ULD) devices that cost only a dollar or so and can be used in small, handheld, battery-powered products.
Let's return to the IoT. In the not-so-distant future there are going to be a lot of people and small companies designing things to get into the IoT. A lot of these products are going to require a lot of data processing, and it’s going to get to a stage where you can’t process all that data with microcontrollers because they consume too much power. The solution will be to use FPGAs, which means people are going to have to learn how to design with them. When it gets to the stage that you only know how to design with microcontrollers, but your competitors are designing smaller, faster, more efficient, cheaper products using FPGAs, then you are going to have to learn FPGAs or you won’t be able to compete. Knowing that this is going to happen, it would be a wise move to start learning FPGAs sooner rather than later.
Once again, none of this is to say that FPGAs will be used for everything -- there is no "silver bullet" when it comes to designing an embedded system. System architects need to be able to look at each project and say "for this project, the best solution is…" Making trade-offs is a large part of what engineering is all about -- taking someone's concept and implementing it in the most cost-effective, reliable, risk-free way.
André: What do you expect of your work in the coming years?
Max: As you know, I started out as a hardware design engineer and sort of drifted into the writing side. I started writing magazine articles, and then moved on to writing books like Bebop to the Boolean Boogie (An Unconventional Guide to Electronics), How Computers Do Math, and FPGAs: Instant Access. This is how I came to be invited to act as an editor at EETimes.
There are several great things about my current position -- first, I'm right at the forefront of what's going on in EDA and electronics -- companies call me up to tell me about their latest and greatest devices and tools and technologies. I really enjoy writing (which is unusual for an engineer) and -- in addition to the technical articles -- I also get to write about my hobby projects (like my Inamorata Prognostication Engine and my BADASS Display) and other stuff that interests me, like augmented reality. Also, I get to give presentations and papers at conferences around the world. Last month, for example, I presented 2 or 3 times a day at the EE Live! embedded systems conference and exhibition. In 2010, I spoke at ESC India. In 2012 I was invited to give the keynote presentation at the FPGA Forum in Norway, and also to give a guest lecture at the University of Oslo in Norway.
So, with regard to what I think I'll be doing in the years to come, I think the answer is "more of the same" -- writing, presenting, and building hobby projects. One thing I would really like to do is be invited to give a 2-hour "Introduction to FPGAs" presentation at ESC Brazil one year (hint, hint).
André: What do you consider to be the next big thing in the FPGA industry?
Max: The problem is that there are going to be so many "next big things." One thing I would point out is that FPGAs are now appearing all over the place. Of course we have the extremely high-end devices from Altera and Xilinx containing billions of transistors -- but we also have ultra-low-density (ULD) parts from folks like Lattice Semiconductor that are appearing in battery-powered, handheld devices.
Altera have just announced FPGAs containing thousands of floating-point DSP blocks, which will revolutionize all sorts of data processing tasks like radar and beam-forming. The new generation of Xilinx FPGAs offers ASIC-like characteristics with regard to things like clocking and routing. I don't think it will be long before we see FPGAs that input and output optical data directly. Xilinx are already shipping what they call 3D FPGAs, in which multiple FPGA dice are mounted on a silicon interposer with more than 10,000 connections between adjacent dice. Honestly, I think the future will be more amazing than we can dream.
André: What do you consider to be the next big thing in the embedded system industry?
Max: Things are moving so fast these days. Remember that the first iPhone was introduced in 2007, which is only 7 years ago. The first iPad was introduced in 2010, which is only 4 years ago. Now we cannot imagine life without access to these sorts of devices. I think we are going to see a much greater deployment of embedded systems in every part of our lives. The Internet of Things is really going to happen. This means many things to many people -- large systems and small systems will all be connected -- but to me the really interesting thing is to have vast numbers of really small systems packed with sensors, all linked together and providing us with unimaginable amounts of data -- the real trick will be to mine this data and transform it into information.
I also think that we are poised to see a vast array of embedded systems equipped with embedded vision and embedded speech capabilities (both understanding spoken commands and issuing spoken responses). I also think that augmented reality (AR) is going to become huge, where AR refers to a live direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as sound, video, graphics, and textual data. (Check out my Augmented Reality column for more details.)
André: Have you ever met a Brazilian engineer working with FPGA or embedded systems in general?
Max: Just one guy at AllProgrammablePlanet, a professor at a Brazilian university. I may have met others, but they don't go around saying "Hey, I am a Brazilian"; they go "Hey Max, would you like a beer?"
I believe there is an ESC in Brazil. I would love to come to Brazil -- I hear that it is a wonderful country. I don’t think I will be coming to ESC Brazil this year (unless I'm invited really soon LOL) but -- if not -- I would very much like to make it next year.
André: Max, thank you very much for your time! It was an honor to interview you.
Max: My great pleasure, thanks for asking me. If they send me to ESC Brazil next year I am going to give you a shout. And then we can buy each other a beer.
Embarcados interview: Clive "Max" Maxfield by André Castelan Prado. This work is licensed under the Creative Commons Attribution-ShareAlike 4.0 International license.