feature article

Elephant in the Room

Synthesis and Place and Route Hold Keys to the Kingdom

I got a note a few weeks ago from an engineering student: “Why does it take so long to compile my FPGA design?” This Twitter-esque brevity led me to semi-consciously fire off some half-helpful standard response, to which the student replied: “My other projects… seem to compile almost immediately, but the FPGA takes forever.” A layer of the onion had peeled back. This was a student who was approaching HDL as just another programming language. To him, the step of synthesis-and-place-and-route was just another “compiler,” and he couldn’t understand why this compiler took so much longer to do its work than, say, gcc.

The idea of the big “go” button for logic synthesis probably originated with Ken McElvain and friends back at Synplicity in the 1990s. Before that, logic synthesis for FPGAs was essentially a hand-me-down from ASIC synthesis, where the tools had complex cockpits composed of thousands of tuning options and hundreds of lines of custom tweaker scripts. In that world, the use of synthesis tools was itself a black art – the purview of a select few engineers who had braved countless hours of trial-and-error and soaked up trillions of machine cycles to find a magic recipe that would yield something within an order of magnitude of hand-optimized schematic design, starting from HDL code.

McElvain’s Synplicity took the bold step of building a synthesis tool especially for FPGAs (most FPGA synthesis was done with modified ASIC synthesis tools in those days). Since most FPGA designers didn’t have the enormous experience base of their ASIC cousins, Synplicity set out to encapsulate all that tuning and tweaking into heuristics within the tool itself, providing the end user with a big “go” button that told the tool to take its best shot at a good solution while shielding the user from the incredible complexity of the synthesis process. The result was phenomenal. Scores of FPGA designers flocked to the Synplicity tools. For all but the elite few engineers with extensive synthesis training, the ease-of-use of the one-button tool trumped the thousands of options of conventional tools. Furthermore, as the money poured into Synplicity, the engineering energy went into optimizing the one-button flow. Soon, the majority of users could get better results with the automated approach than they could by navigating the plethora of processing options offered by other tools in search of a magic recipe.

FPGA synthesis turned out to be a much more complex problem than ASIC synthesis. While ASIC gates were very close to the resources imagined by academic researchers – basic logic gates with varying input widths – the look-up tables (LUTs) of FPGAs were much coarser-grained and more difficult to map using traditional algorithms. Because of this, purpose-built FPGA synthesis tools like Synplify from Synplicity and Leonardo from Exemplar Logic quickly gained a substantial technology lead over re-purposed ASIC synthesis tools such as those from Synopsys, Cadence, and Magma. FPGA synthesis had become its own arena.
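
To make that mapping difference concrete, here is a minimal Python sketch (a toy illustration, not any vendor’s mapping algorithm) of why the LUT is such a coarse-grained target: a k-input LUT implements any Boolean function of up to k inputs simply by storing its truth table, so the mapper’s real job is deciding which cone of logic each LUT should swallow.

    # Toy illustration, not a real technology mapper: a k-input LUT can
    # implement *any* Boolean function of up to k inputs by storing its
    # truth table, so FPGA mapping is a covering problem (which cone of
    # logic goes into which LUT?) rather than simple gate selection
    # against a cell library.

    from itertools import product

    def lut_init_bits(func, inputs):
        """Enumerate every input combination to build the LUT's configuration bits."""
        bits = []
        for values in product([0, 1], repeat=len(inputs)):
            env = dict(zip(inputs, values))
            bits.append(int(bool(func(env))))
        return bits

    # Example cone of logic: f = (a AND b) OR (c XOR d). An ASIC mapper sees
    # three library gates; an FPGA mapper collapses the whole cone into one
    # 4-input LUT holding a 16-entry truth table.
    cone = lambda env: (env["a"] and env["b"]) or (env["c"] ^ env["d"])
    init = lut_init_bits(cone, ["a", "b", "c", "d"])
    print("LUT4 INIT =", "".join(map(str, init)))  # 16 configuration bits

Choosing which gates to collapse into which LUT, across an entire netlist, is the covering problem that traditional ASIC-oriented algorithms handled poorly.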

With the concentration of FPGA synthesis technology in only two companies – Synplicity and Mentor Graphics (who had acquired Exemplar Logic and their Leonardo tool) – the big FPGA companies had a problem.  The most important technology in the delivery of FPGAs to the masses was owned by just two EDA companies, one of whom had only a partial interest in FPGAs.  These EDA-company synthesis tools were expensive (in the tens of thousands of US dollars), and therefore the barrier to entry for serious FPGA designers was very high.  Xilinx and Altera wanted to lower that barrier so that more engineers could afford to try FPGA design, and they certainly didn’t like third parties holding the keys to the FPGA kingdom.

Both FPGA companies set out with plans to remedy the situation.  First, they both did OEM deals with the EDA companies to distribute low-cost, reduced-capability versions of their FPGA synthesis tools as part of the FPGA companies’ standard tool suites.  These OEM deals offered attractive short-term payoffs to the eager EDA companies, but they also quickly saturated the market with low-cost tools, forcing EDA companies to compete with themselves for FPGA synthesis seats, and putting them in a position where they were able to sell only to the most demanding and well-funded designers and teams.  In addition to achieving the original goal of lowering the barrier to entry for FPGA design, these OEM deals had another benefit for FPGA companies.  They bought time.

The time that FPGA companies gained with the OEM deals allowed them to properly prepare their war machines.  Both Xilinx and Altera started FPGA synthesis development projects of their own – with much larger engineering groups than those at Synplicity and Mentor.  Xilinx acquired IST – a French logic synthesis company founded by Professor Gabrielle Saucier in 1992 – and re-branded the offering “XST.”  Altera quietly started developing logic synthesis as a built-in component of their Quartus development tool. 

Even though the FPGA companies were playing come-from-behind on FPGA synthesis, they had several advantages on their side. First, each company could focus on optimizing its tool for its own FPGA architecture alone. They didn’t have the overhead of producing general-purpose tools, with the inevitable trade-offs that come from straddling the constraints of multiple architectures. Second, their FPGA synthesis teams could influence the actual design of future FPGA architectures, whereas the EDA synthesis teams could not. Third, their FPGA synthesis teams had the earliest possible access to, and the most detailed information about, their company’s FPGA architectures. Fourth, they had access to the EDA companies’ tools and benchmarked them regularly, which gave them the advantage of a known, measurable target to work against.

For the EDA companies, the persistence of the FPGA companies’ in-house efforts choked off revenue and squeezed engineering budgets. They were trying to maintain a narrow lead in tool capability with smaller engineering teams and more complex problems to solve. Year after year, however, they appeared to succeed. Commercial EDA synthesis tools – even today – generally give better results than those offered by the FPGA companies.

But the gap is still closing.

The most serious complication for third-party FPGA synthesis came from Moore’s Law. As FPGAs got larger (with smaller geometries), routing delays – as a percentage of total delay through a logic path – got larger. In early FPGA architectures, logic delay was the dominant factor, and a reasonable (within 15-20%) estimate of logic path delay could be made simply from the number of levels of logic between registers. As routing delay became the dominant factor, however, the layout of the chip, not the logical topology, became the determining factor in timing. Synthesis tools now needed information about the placement and routing of the design (which happen after synthesis) in order to make reasonable estimates of routing delay for their optimization algorithms. Place-and-route technology, however, was exclusively owned by the FPGA companies.
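
A rough sketch of the problem, using made-up delay numbers rather than figures from any real device: when logic delay dominates, a levels-of-logic estimate with an average wire delay lands close enough, but once a few nets route across the die, the same estimate misses badly.

    # Back-of-the-envelope sketch (illustrative numbers only, not from any
    # datasheet): why a "levels of logic" delay estimate stops working once
    # routing delay dominates.

    def prelayout_estimate(levels, lut_delay_ns, avg_route_ns):
        """Pre-layout guess: every net is assumed to be an 'average' wire."""
        return levels * (lut_delay_ns + avg_route_ns)

    def postlayout_delay(lut_delay_ns, routed_net_delays_ns):
        """Post-layout truth: each net's delay depends on actual placement."""
        return sum(lut_delay_ns + d for d in routed_net_delays_ns)

    levels = 5

    # Older process: logic dominates, routing is small and fairly uniform.
    old_actual = postlayout_delay(2.0, [0.4, 0.5, 0.3, 0.6, 0.4])
    old_estimate = prelayout_estimate(levels, 2.0, 0.45)

    # Newer process: LUTs are fast, but a couple of nets route across the die.
    new_actual = postlayout_delay(0.5, [0.2, 1.8, 0.3, 2.5, 0.4])
    new_estimate = prelayout_estimate(levels, 0.5, 0.6)

    for label, est, act in [("older", old_estimate, old_actual),
                            ("newer", new_estimate, new_actual)]:
        print(f"{label}: estimate {est:.1f} ns, actual {act:.1f} ns, "
              f"error {100 * abs(act - est) / act:.0f}%")

With numbers like these, the pre-layout estimate lands within a couple of percent on the older process but misses by roughly 30% on the newer one, and that gap is exactly what pushes synthesis toward needing placement information.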

The FPGA companies made a token effort to support the physical synthesis efforts of the EDA companies. They added interfaces to back-annotate or otherwise give synthesis tools access to placement data and estimated routing delays. They even signed technology-partnership agreements with the EDA companies for joint development of physical synthesis capabilities. In the back room, however, their synthesis teams were still hard at work improving their optimization algorithms and integrating tightly with their own place-and-route tools. Their enormous installed base gives them an unprecedented feedback channel from a huge variety of design projects all over the world, and their larger-than-EDA engineering teams (which don’t have to justify their salaries against quarterly tool revenue) give them a significant edge over their commercial counterparts.

The handwriting is on the wall.

In time, it seems almost inevitable that only the FPGA companies’ own synthesis and place-and-route tools will survive. EDA company profit margins for FPGA synthesis continue to dwindle. Since Synplicity was acquired by EDA giant Synopsys, their movements have become harder to track. While they still have a healthy stream of product releases and upgrades to the Synplicity products, Synopsys as a company has not demonstrated a great deal of commitment to or confidence in the FPGA market (and with good reason). Mentor Graphics has also gradually gotten quieter with its FPGA messaging over time. Perhaps the big EDA companies are both wrestling with these same complexities in the FPGA tools market and trying to decide how to proceed. Their history should tell them that owning a core enabling technology is the only way to score a “home run” as a third-party EDA supplier. The FPGA companies have done an admirable job capturing and defending the FPGA synthesis market, and they won’t give it up easily. That single victory almost completely blocks EDA companies from any meaningful, high-growth, big-money participation in the rapidly growing FPGA market.

If the EDA companies lose, there are other losers as well. The smaller and startup FPGA companies, by and large, have not had the resources or the time to develop their own synthesis capabilities. That means their fortunes are tied to those of third-party EDA synthesis tools. However, their smaller audiences are not sufficient to carry the EDA companies to a robust, profitable business. If EDA abandons FPGA synthesis, smaller FPGA companies like Lattice, Actel/Microsemi, and others could be in deep trouble.

As FPGAs expand their role in the market, this monopoly on implementation technologies like synthesis and place-and-route becomes even more insidious. While it is comparatively easy for a startup to come up with a new FPGA fabric architecture, make a deal with a merchant IC foundry, and start cranking out its own FPGAs, it is almost impossible to create a robust, commercially viable synthesis and place-and-route tool suite – regardless of budget. In addition to millions of dollars in engineering and years of calendar time to mature, synthesis and place-and-route paradoxically require lots of customers in order to evolve. The proverbial infinite number of monkeys randomly banging on typewriters would write the complete works of Shakespeare in far less time than it would take them to come up with working FPGA synthesis tools.

Since the two big FPGA companies evolved theirs with captive audiences during the boom days of the emergence of FPGA technology, the opportunity for willing and forgiving guinea pigs has passed. The core optimization problems in synthesis and place-and-route are NP-hard, so there is no “magic bullet” algorithm that will yield superior results. Developing these capabilities requires years of testing, evolution, and fine-tuning of monstrously complex software systems to reach the levels of capability we take for granted today. If a company such as Intel, for example, wanted to enter the FPGA business in a meaningful way, it would have little alternative but to acquire one of the existing big FPGA companies – specifically to acquire the core technology of synthesis and place-and-route.
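
For a feel of why there is no closed-form answer, here is a toy placement loop based on simulated annealing, one classic heuristic family used in placers (an illustrative sketch only, not any vendor’s algorithm): quality comes from enormous numbers of trial moves and years of tuning the cost function, which is also a big part of why a place-and-route run takes so much longer than a gcc compile.

    # Toy placement by simulated annealing (illustrative only; real placers
    # are vastly more sophisticated). Cells are swapped at random, and swaps
    # that worsen total wirelength are still accepted with a probability that
    # shrinks as the "temperature" cools, which is why good results come from
    # many iterations rather than one clever formula.

    import math
    import random

    random.seed(0)

    GRID = 8                                    # 8 x 8 array of sites
    cells = list(range(GRID * GRID))            # one cell per site (toy)
    nets = [random.sample(cells, 3) for _ in range(60)]   # random 3-pin nets

    # placement: cell -> (x, y)
    place = {c: (c % GRID, c // GRID) for c in cells}

    def wirelength(pl):
        """Half-perimeter wirelength summed over all nets."""
        total = 0
        for net in nets:
            xs = [pl[c][0] for c in net]
            ys = [pl[c][1] for c in net]
            total += (max(xs) - min(xs)) + (max(ys) - min(ys))
        return total

    cost = wirelength(place)
    temp = 5.0
    while temp > 0.01:
        for _ in range(500):
            a, b = random.sample(cells, 2)
            place[a], place[b] = place[b], place[a]        # trial swap
            new_cost = wirelength(place)
            delta = new_cost - cost
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                cost = new_cost                            # accept the move
            else:
                place[a], place[b] = place[b], place[a]    # undo the swap
        temp *= 0.9                                        # cool down

    print("final wirelength:", cost)

Scale the toy 8x8 grid up to hundreds of thousands of cells, fold timing and routability into the cost function, and add years of tuning against real customer designs, and you have a rough picture of where the engineering (and the compile hours) go.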

If FPGAs – or programmable logic technology in general – are a long-term important component of digital electronics, then synthesis and place-and-route have put just two companies – Xilinx and Altera – in the driver’s seat for the massive profit and growth that are possible in this market.  That’s a lot of responsibility for two mid-sized companies to handle. 

Dear engineering student,  Regarding your question about FPGA synthesis and place-and-route compared with gcc… see above.
