
Electronic Elitism

DAC Divulges Design Tool Dilemmas

The 43rd annual Design Automation Conference (DAC) got underway yesterday in San Francisco, California. The technical sessions have begun, the exhibits are open, and the parties, PowerPoints, and pejoratives have now commenced. Last night, at a media, analyst, and customer briefing dinner at the San Francisco Museum of Modern Art, Mentor Graphics Chairman and CEO Walden Rhines hosted a customer presentation explaining how Mentor’s new Calibre nmDRC accelerates nanometer design rule checking (DRC) through “hyperscaling” – a technique that takes efficient advantage of multiple processing elements to deliver many times the previous performance in giant DRC runs.
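
For readers who want a feel for the idea, here is a minimal sketch of that flavor of parallelism – a toy minimum-spacing check fanned out across four worker processes in Python. Everything in it is invented for illustration (the rectangles, the rule value, the chunking scheme); it makes no claim to represent Calibre’s actual algorithms or data structures.

from itertools import combinations
from multiprocessing import Pool
MIN_SPACING = 3.0  # hypothetical spacing rule, in arbitrary grid units
def spacing(r1, r2):
    # Edge-to-edge distance between two axis-aligned rectangles (x1, y1, x2, y2).
    dx = max(r1[0] - r2[2], r2[0] - r1[2], 0.0)
    dy = max(r1[1] - r2[3], r2[1] - r1[3], 0.0)
    return (dx * dx + dy * dy) ** 0.5
def check_chunk(pairs):
    # Each worker reports the pairs in its work unit that violate the rule.
    return [(a, b) for a, b in pairs if 0.0 < spacing(a, b) < MIN_SPACING]
if __name__ == "__main__":
    layout = [(0, 0, 2, 2), (3, 0, 5, 2), (10, 10, 12, 12), (12.5, 10, 14, 12)]
    pairs = list(combinations(layout, 2))
    chunks = [pairs[i::4] for i in range(4)]  # one slice of the work per worker
    with Pool(4) as pool:  # four processing elements: toy-scale "hyperscaling"
        violations = [v for c in pool.map(check_chunk, chunks) for v in c]
    print(f"{len(violations)} spacing violations found")

The hard part in production DRC, of course, is partitioning a multi-billion-polygon layout so that the work units stay balanced and the checks along tile boundaries don’t eat the speedup – which is presumably where the real engineering behind a claim like “hyperscaling” lives.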

Beyond promoting the polygon-pushing power of parallel processing, Mentor’s presentation also highlighted an interesting reality of today’s Electronic Design Automation (EDA) industry – nanometer-class design tools are getting bigger, faster, more sophisticated, more expensive, and consequently, more exclusive. Among the customers heaping praise on Calibre nmDRC for its parallelizing prowess were AMD and Intel – hardly the novice class in semiconductor design.

It is no secret that the number of ASIC and COT (customer-owned tooling) design starts has declined steadily over the past several years and is forecast to continue declining for the foreseeable future. Along with the decline in the number of design starts, the number of companies and designers engaged in ASIC/COT design has declined as well. At the same time, the cost and complexity of ASIC/COT design and the sophistication of the tools and methods required to complete a current-generation chip have risen almost exponentially.

For the companies that develop tools for these designers, this is both an opportunity and a curse. The opportunity side is obvious – designers need extremely sophisticated tools to complete their projects, and they’re willing to pay top dollar to get them. For the company that can build the fastest, best-est, biggest super-deep-submicron development and verification tools, the negotiation process practically starts with a blank check. If you are in charge of a $15M–$30M SoC development project, how much would you pay for the software tools you need to help ensure that you will actually have working silicon at the end of your project? Well, I’ll wager that you’re not going to choose a weaker tool just to save a few thousand dollars. Spending a little extra of the company’s money to help ensure your personal job security tends to be an easy decision.

The curse side of the equation is probably more important, but less self-evident. While the promise of a big payday is obvious for developing “the world’s best” super-critical-something-or-other (SCSOO) that makes the difference between success and failure on a high-budget SoC, the payday for developing an “almost good enough” or “second best” SCSOO is also easy to predict – near zero. Simply put, nobody wants to buy anything but the proven best technology when they’re gambling with their careers on a high-stakes SoC design.

All these trends make for an interesting game. For any given high-end EDA technology, we tend to end up with one of two scenarios: a monopoly with a single “winner” that develops and markets a clearly superior technology and several “losers” that attempt and fail to bring similar solutions to market, or a duopoly where two companies succeed in developing mostly indistinguishable solutions and thus share the resulting market. In the first case – to the victor go the spoils. A monopoly on a critical technology is worth tens to hundreds of millions of dollars in EDA. In the second case, however, an unfortunate variation on the famous “prisoner’s dilemma” kicks in. With almost equivalent solutions, software companies tend to jump immediately into competing on price. Since the unit cost of their product is essentially zero and the development costs have already been borne, there is no practical limit to the price reduction a company can endure in order to win market share. With more than one competitor battling for the pie, the size of the pie diminishes dramatically. EDA companies engaged in such markets are lucky to even recoup the development and marketing costs of these unfortunate products.
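
The dynamic is easy to see with an invented payoff matrix – every number below is made up purely for illustration, but the shape of the trap is the classic one:

# Hypothetical annual profits ($M) for two vendors with near-identical tools.
payoffs = {
    ("hold", "hold"): (40, 40),  # both hold price and share a healthy market
    ("hold", "cut"):  (5, 60),   # the price-cutter steals the market share
    ("cut", "hold"):  (60, 5),
    ("cut", "cut"):   (10, 10),  # mutual price war: the whole pie shrinks
}
for b_move in ("hold", "cut"):
    best = max(("hold", "cut"), key=lambda a: payoffs[(a, b_move)][0])
    print(f"If B plays '{b_move}', A's best reply is '{best}'")

Whatever the rival does, cutting price is the better reply, so both vendors cut – and both land at (10, 10) instead of (40, 40). That, in a nutshell, is why two winners can be worse than one.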

For EDA management, this presents a dilemma. For any given new EDA product development effort there are three possible outcomes, and two of them are bad. EDA companies long ago realized this, however, and did some thinking of their own. It turns out that the cost of developing new EDA tools is almost negligible compared with the cost of marketing, selling, and supporting them. EDA companies can start as many projects as they like with comparatively little risk, as long as they’re extremely careful about which products they actually bring to market with full force.

In many instances, EDA companies have given up on even these low-investment speculative internal development projects. Instead, they keep a watchful eye on the packs of startup ventures regularly jumping into the design tool pool. You can start an EDA company with a laptop and a dollar, so there are typically droves of them to choose from. DAC, in fact, provides a veritable showcase of these new entries each year – creating an acquisition flea market of sorts for big EDA. EDA companies bet on the struggling contenders like gambling addicts at a dog track, and often end up in bidding wars, paying undeserved premiums for perceived leaders in obscure technology areas.

In some critical technology areas, however, the startup farm is less fruitful. Some EDA capabilities are built on top of rich foundations of previous technology, and most startups don’t have the IP or the wherewithal to build the entire technological pyramid required to tackle issues like, for example, deep submicron design verification. In those cases, the EDA companies slug it out with generation after generation of internally developed tools, far from the protection of startup-based development independence, and with egos and inertia preventing them from backing down on long-fought competitive battles.

This game has proven the most dangerous of all, as it results in EDA companies spending ever-increasing sums developing ever-riskier products for smaller and smaller (but well-funded) audiences. Increasingly, this part of the industry is evolving into an exclusive service for the electronics elite. Such a service does not necessarily make good business sense, however: when you build products for a smaller and smaller audience, you lose the advantage of economy of scale, and we’re beginning to see the technical tide changing as a result.
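
The back-of-the-envelope arithmetic is unforgiving. Using invented but plausible numbers, simply amortizing a fixed development cost over a shrinking seat count looks like this:

# All figures are hypothetical, purely to show the amortization effect.
DEV_COST = 30_000_000  # assumed one-time tool development cost, $
SUPPORT = 20_000       # assumed annual per-seat support cost, $
for seats in (10_000, 1_000, 100):
    print(f"{seats:>6} seats -> break-even ~ ${DEV_COST / seats + SUPPORT:,.0f} per seat")

Spread across ten thousand seats, that development cost adds only $3,000 per seat; across a hundred seats, it adds $300,000 – which goes a long way toward explaining the six-figure seat prices at the nanometer high end.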

As the price and complexity of EDA products increase, EDA’s customers start to see advantages in developing technology themselves, either because they can create a competitive advantage by building a superior tool capability, or because commercial EDA tools can no longer handle the complexity of their problems. In fact, an increasing number of the most advanced SoC developers have begun to do just that, and internally developed tools for high-end SoC design are on the rise. Note to Mentor – when you’re developing products for companies like AMD and Intel, and the key new capability you’re providing them is multi-processor acceleration of a computationally intense algorithm, the line between customer and competitor can become perilously thin.

If we take this trend to its logical extension, the high-end tool zone will have made a complete (albeit three-decade) circle. ASIC tools began as internally developed technology at ASIC vendors in the early 1980s, then became a third-party industry as economy of scale gave rise to commercial EDA, and now may return to internal development as the economy of scale evaporates, leaving EDA to pursue greener pastures.

If so, where might those greener pastures lie? Many have placed bets on FPGA tools. For the past ten years or so, FPGA companies have played Robin Hood for design tool technology. They have taken from the wealth of tools and techniques originally created for well-funded ASIC design and distributed those tool technologies (re-spun for FPGA design) at very low cost to the multitudes of design teams who sought refuge from the high-stakes game of ASIC by migrating to FPGA-based development. This “free-tool” model has long been a decisive obstacle blocking commercial EDA from investing heavily in FPGA tool development. With a few notable exceptions like Synplicity, Mentor Graphics, Aldec, and Australia-based Altium, the EDA industry has been reluctant to risk diving into the FPGA tank only to have its profits eaten by comparable tools supplied by FPGA vendors.

With the FPGA market itself very close to a duopoly, the economy of scale has never materialized as it did in ASIC design tools, where a multitude of viable silicon suppliers competed. Unless the FPGA market diversifies further, it is unlikely that a highly competitive and lucrative third-party EDA market for FPGA-specific tools will emerge. In fact, Xilinx and Altera, the two largest FPGA companies, would today realistically need to be counted among the world’s largest EDA companies. If the measurement were based on the number of installed seats of design tools, they might very well be the biggest.

With FPGA companies grabbing the designs and design tool budgets of all but the largest, most complex product development efforts, EDA will need to find a new avenue for itself in case the current nanometer boom dries up. Attacking design process and productivity at the system level, where combined hardware and software design has exploded in complexity in recent years, is likely the best direction for long-term, sustainable, technology-independent value creation. If so, watch for electronic system level design (ESL) to take an increasing share of the EDA market in coming years and an increasing share of air time at coming DACs.

