Just about every electronic technology on the market today has alternatives. Between custom chips, ASSPs, pre-built modules, embedded processors, microcontrollers, FPGAs, and a host of other silicon-based goodies, there are always numerous ways to solve any given problem. As engineers, we make our choice based on any number of criteria – cost, power, size, reliability, our familiarity and experience with the technology, our company’s preferences – and all of these weigh into our decision.
Some solutions – like full custom chips – occupy an “end” of the solution space. When you have enough budget, a talented and experienced design team, a large production volume to amortize development costs, and hard technical demands in areas like system performance, power consumption, integration, or unit cost – custom ICs are the limit of what can be done. On the other end of the spectrum are solutions like pre-manufactured boards and modules with embedded processors. One guy in a garage can write a little code and practically ship a new product by himself.
Between these extremes lies the Valley. The Valley is a rich and fertile marketing ground, inhabited by all manner of furry-friendly woodland creatures with big welcoming eyes and even happier datasheets. FPGAs live here in the Valley. They are cheaper to design than custom ICs, yet more capable than fixed-hardware boards. They require far less expertise and risk to design than ASICs, but they are much more challenging to use than simply programming an embedded processor. They can offer tremendous performance advantages over a dedicated DSP chip, but they don’t quite have the power efficiency required for many battery-powered applications.
FPGAs are bounded by alternative solutions in just about every dimension and direction. Choose anything you want to measure – cost, power, size, performance – and FPGAs will be in between two other viable alternatives. Unfortunately, this means that FPGAs are always under attack, always defending and challenging borders. For years, FPGAs have been billing themselves as “ASIC replacements,” trying to shave off the bottom end of the ASIC market – those who can no longer afford to do an ASIC on the latest process node. In return, low-cost ASIC companies built a business on “FPGA conversion” – taking FPGA designs and reducing unit cost while improving power and performance by converting them to less-than-the-latest-node ASIC processes.
FPGAs also attack the processor space – and get attacked in return. FPGAs offer an integration story – your CPU and any combination of peripherals can be integrated into a single programmable logic device. Processors retaliate by offering the CPU with most of the peripherals you could want – all integrated into a lower-cost, higher-performance device. FPGAs offer the ability to accelerate performance-critical parts of your application, and application processors retaliate by getting faster and faster, obviating the need for hardware acceleration.
Every direction you look, FPGA technology is under attack from alternative solutions. And each FPGA company is under attack as well – from other FPGA competitors. Often, the companies lose sight of the external boundaries and focus their energy on wresting market share points from like-minded competition. These are expensive revenues to earn. It’s an attractive battle to fight because you know your enemy, and the score is easily divined from market share reports. What is more difficult to see is the state of the battle against the outside alternatives. As long as the field where you’re grazing at the moment is rich and green, you may not notice that the entire Valley is quietly closing in around you.
The real key to the battle for these borders is tools. As we have mentioned many times, most of the cost of an FPGA is margin. FPGA companies charge an enormous premium over the raw cost of the silicon they sell. Where does all that money go? Into private jets and limos for FPGA company executives, of course. No, just kidding. Most of that money goes into the development, maintenance, and support of the tool suites required for FPGA design. The two largest FPGA companies have giant engineering teams working on tools and IP – certainly larger than the teams who work on the development of the FPGAs themselves. At stake in the tool battle is nothing less than the viability of the FPGA companies themselves, for without winning tools, engineers will flee to alternative solutions – whether those are the other company’s FPGAs or something else entirely.
The challenges of producing high-quality tools for FPGA design just keep getting tougher, too. Every two years, like clockwork, the FPGA companies roll out a new device family on a new process node with double the LUTs, more than double the routing, more memory, more hard-wired macros, and longer clock lines. Each one of these improvements breaks the old tool chain in new and different ways. Synthesis runs that used to take minutes now take hours. Layout runs that used to take hours now take days. Timing optimization becomes an endless iterative loop that never seems to converge. Management of routing resources becomes untenable. Just about every aspect of our friendly FPGA tools gets broken with each new family and has to be re-engineered and repaired. If the tools aren’t working, and working well, it doesn’t matter how good a new FPGA family is. Nobody will be able to use it.
Unlike in the ASIC and COT markets, the playing field here is not level. For Altera and Xilinx, if their own tools cannot handle the design challenges posed by their chips, they are absolutely dead in the water. No EDA company will come riding over the horizon with third-party technology to save their bacon. If one competitor noses ahead of the other, it can be catastrophic for the loser. In the past couple of decades, we’ve seen this twice. First, when Altera released their original “Quartus” tools, the issues were so bad that they almost wiped out the company. Then, Altera recovered and took a sustained lead with their “Quartus II” suite, and they have kept Xilinx playing catch-up ever since. Now, Xilinx is rumored to be re-engineering their entire tool suite from the ground up – a make-or-break play with enormous implications. It could literally determine the future of the company and its technology.
We are currently reaching a season of great change in the Valley of FPGAs. With the first 28nm devices now just coming to market, new suites of tools slated to arrive, and fierce competitors attacking the fertile FPGA Valley from every side, we are at a pivotal juncture. FPGAs may begin a rapid expansion and join the ranks of processors and memories as assumed major components of every electronic system made, or they may retreat into niche applications where no more robust and usable alternatives exist. We’ll see what’s revealed when spring melts away the winter’s frost.
Will the green, green valley of FPGAs survive the long cold winter? Or will alternative solutions push programmable logic back into a niche?