
State of the Union – Addressed

Major Players Weigh in on FPGA

Since I was moderating the panel, and we know how these things tend to go, we had a pre-meeting before the conference to get the preliminaries out of the way:
Moderator (me):  OK, let’s get started with our FPGA panel discussion…
Exec #1:  My chips are faster than yours.
Exec #2:  No they aren’t, but they use way more power.
Exec #3:  That’s just because you’re not using our design tools.
Exec #4:  Nobody uses your design tools. Everyone uses ours because they’re much more powerful.
Editor’s Note:  This conversation is completely fictional and is included for entertainment value only.  It never happened.  It is not representative of any real-life FPGA or EDA companies or their executives.  We promise.  Will those black helicopters please stop circling our office now?

Really, what could you expect executives from four companies simultaneously engaged in both intimate technical cooperation and bitter competition to agree on?  The answer, most surprisingly, is “almost everything.”

Let’s set the stage: For the past decade or so, two companies – Xilinx and Altera – have dominated the FPGA/CPLD industry with a cumulative market share probably somewhere north of 70%. Synplicity and Mentor Graphics have been the only two traditional EDA companies to make and maintain significant investments and engineering efforts focused on FPGA-specific tools. Together, they too account for something greater than 70% of the commercial tool market for FPGA design.

It shouldn’t surprise anyone that both of these pairs of companies have nurtured rivalries worthy of Michigan and Ohio State, the Yankees and the Red Sox, Everton and Liverpool, and Ferrari and McLaren. What might be a surprise is the cross-technology competition among partners that has also emerged because FPGA companies develop and distribute powerful design software that duplicates many of the capabilities of commercial EDA tools – essentially for free. EDA vendors have never been too happy about trying to build a business competing with products that are almost given away, and FPGA companies are leery of trusting their technical fates to companies that work closely with both them and their strongest competitors.

This is all old news, of course, but it forms the invisible tapestry we all see so clearly behind events like the DesignCon executive forum. Our topic was “Executive Views on the Turbulent FPGA Landscape”… No kidding! What these astute high-tech execs revealed over the course of the two-hour discussion, however, is that technology challenges know no boundaries, and the obstacles facing broader market adoption of FPGAs trump inter-company efforts at differentiation. It seems that everyone would be happier with a good slice of a bigger pie than with a bigger slice of the same pie.

The good thing about the FPGA market – everyone seemed to agree – is that it’s growing, and growing in just about every dimension. The number of design teams using FPGAs in their products is increasing rapidly as FPGAs move into new market segments in the consumer, automotive, and industrial areas. The raw number of devices shipped is increasing dramatically as high-volume “killer apps” such as programmable display drivers for flat-panel and LCD displays pull enormous numbers of lower-cost chips from inventory. EDA companies’ revenues from FPGA design tools are up, and the number of customers continues to climb. Total silicon revenues for FPGAs are also at an all-time high, finally eclipsing their pre-collapse numbers from the telecom bubble that burst in 2001.

This FPGA market growth comes at the expense of other design methodologies such as ASIC (including customer-owned tooling, or COT, ASIC), which are on a steady run of decreasing design starts. For FPGA companies, this is sunshine and roses – a chance to bring their technology into green fields rife with opportunity. For EDA companies, it is a mixed blessing – an opportunity in a thriving and growing market, tainted by the threat of cheap competition and set against the backdrop of the steady decline of the franchise-building ASIC tool business.

In this discussion, however, the focus was on common enemies deployed by the evil armies of Mr. Moore’s relentless Law. Design complexity is up dramatically, forcing tools to get bigger, faster, and more capable. To underscore this point, Misha Burich, Senior Vice President of R&D at Altera, pointed out that the largest Cyclone III device (a 65nm “low-cost” FPGA) is now about 50% larger than the biggest “high-end” FPGA was just five years ago. In response to this trend, legacy design methodologies like pure HDL-based design are being replaced by higher-leverage, higher-productivity approaches like IP-based design and ESL. While ESL has often been maligned in the press with quips such as “ESL is the design technology of the future… and always will be,” all four companies represented on this panel are engaged in significant development and deployment of ESL technology. Commenting on the definition of ESL, Simon Bloch, General Manager of Mentor’s Design and Synthesis Division, said, “ESL is anything that gains you 10x-100x in design productivity.” Perhaps, then, our collective problem with ESL has always been with its definition rather than its viability.

One of the first targets for ESL-like methodologies in FPGAs is DSP design. DSP algorithms offer an opportunity for FPGAs to shine, demonstrating their ability to massively parallelize datapaths and overwhelm conventional DSP processors at high-throughput applications in just about every dimension, from cost to power consumption to footprint to – uh oh… not productivity or ease of programming. That is where ESL methodologies help level the playing field. By making FPGA “programming” for DSP comparable in complexity to software-based programming of DSP chips, ESL makes hardware-phobic DSP design teams more likely to take the plunge into programmable logic. In this case, “ESL” could be anything from model-based design through something like Synplicity’s DSP Synthesis tool combined with The MathWorks’ Simulink environment, to direct compilation of algorithmic C++ into optimized hardware using something like Mentor Graphics’ Catapult C.
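To make the “algorithmic C++ into hardware” idea a little more concrete, here is a minimal sketch of the kind of plain, loop-based C++ a high-level synthesis flow might start from. The function name, data types, and tap count are illustrative assumptions for this article, not taken from any vendor’s tool or library.

```cpp
#include <array>
#include <cstdint>

// Illustrative 16-tap FIR filter written as ordinary, loop-based C++.
// A high-level synthesis flow could unroll this multiply-accumulate loop
// into sixteen parallel multipliers feeding an adder tree in the FPGA
// fabric, while a DSP processor would execute the same loop one tap at
// a time. All names and sizes here are placeholders for illustration.
constexpr int kTaps = 16;

int32_t fir_sample(const std::array<int16_t, kTaps>& coeffs,
                   const std::array<int16_t, kTaps>& samples) {
    int32_t acc = 0;
    for (int i = 0; i < kTaps; ++i) {
        acc += static_cast<int32_t>(coeffs[i]) * samples[i];
    }
    return acc;
}
```

The point of the sketch is simply that the designer describes the algorithm, not the hardware: the decision to spend multipliers for throughput (or to share them to save area) becomes a tool setting rather than a hand-coded RTL rewrite.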

Moore’s Law leaves other artifacts behind as well, and Andrew Dauman, Senior Vice President of Worldwide Engineering at Synplicity, pointed out that adoption of physical synthesis technology for FPGA implementation is increasing rapidly. As with ESL, every company at the table has some form of physical synthesis or combined logical and physical timing optimization on the market, so the argument is easy to swallow. With faster designs come tighter timing budgets, and those give rise to the kind of problems that can reasonably be addressed only with physically aware logic design tools.

Much bigger than a blip on everyone’s radar was the topic of power. While power optimization is a hot topic these days for everyone from Al Gore to mobile device manufacturers, FPGAs are particularly vulnerable to power concerns. For most FPGAs, the act of simply staying alive and configured means keeping power applied to bazillions of configuration transistors, and as process geometries shrink, the leakage current through all those SRAM-like cells explodes. At the same time, that explosion is multiplied by the same factors by which we increase device density. It’s not a pretty picture.
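As a rough illustration of why that multiplication hurts, here is a back-of-the-envelope sketch of static power scaling. Every number in it is a made-up placeholder chosen only to show the compounding of more configuration cells with leakier transistors; none of it is measured data from any real device.

```cpp
#include <cstdio>

// Back-of-the-envelope static power estimate: total leakage grows with
// BOTH the number of always-powered configuration cells (density) and
// the per-cell leakage current (which worsens as geometries shrink).
// All values below are illustrative placeholders, not measured data.
int main() {
    const double vdd_volts     = 1.2;   // supply voltage (placeholder)
    const double cells_old     = 10e6;  // config cells, older device (placeholder)
    const double cells_new     = 40e6;  // config cells, denser device (placeholder)
    const double leak_old_amps = 1e-9;  // per-cell leakage, older process (placeholder)
    const double leak_new_amps = 4e-9;  // per-cell leakage, smaller process (placeholder)

    const double p_old = cells_old * leak_old_amps * vdd_volts;  // watts
    const double p_new = cells_new * leak_new_amps * vdd_volts;  // watts

    std::printf("older device static power:  %.3f W\n", p_old);
    std::printf("denser device static power: %.3f W\n", p_new);
    std::printf("growth factor:              %.1fx\n", p_new / p_old);
    return 0;
}
```

With these invented numbers, a 4x jump in density combined with a 4x jump in per-cell leakage yields a 16x jump in static power, which is the multiplicative squeeze the panelists were describing.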

Rajeev Jayaraman, Senior Director of Implementation Tool Development at Xilinx, agreed that power has been a major issue for FPGA vendors for years. Going back to the development of their 90nm Virtex-4 technology, Xilinx began using variable oxide thicknesses so that transistors that don’t require rapid switching, such as those in the configuration path, can be implemented with thicker-oxide, low-leakage gates, while the thin oxides are reserved for transistors that toggle rapidly. Archrival Altera, of course, has brought their own leakage-mitigation methods to the table, and even the EDA companies are adding features that help combat the static current monster.

Power optimization for FPGAs is considerably more challenging than for ASICs – first, because of the aforementioned mass of configuration logic that isn’t present in ASIC devices, and second, because FPGAs require specific hardware architectural support for most of the “tricks” that have been used in ASICs to reduce power consumption. If you want your FPGA design tool to apply dynamic power reduction techniques like voltage and frequency scaling or clock tree disabling, you’ll need special support in the FPGA architecture to make that possible. This is one of the reasons the four companies at this panel’s table have been seeing a lot more of each other lately. Solving the power problem for FPGAs is in everyone’s interest, and nobody can do it alone.

Some may have been disappointed by the abject agreement of the panelists assembled for this event. After all, what’s more boring than showing up to watch a good brawl and having everyone shake hands and make nice? From an FPGA technology point of view, however, the panel couldn’t have defined a clearer roadmap: the demand for programmable solutions is rapidly increasing; advanced design tool capabilities are being developed to cope with the dark side of Moore’s Law’s gifts to FPGA technology – particularly power consumption; and FPGAs will definitely be paired with massive productivity-enhancing tool advances such as ESL.
