Benefits and Tradeoffs of EDA in the Clouds

Only a few weeks after Motorola® launched the Xoom™, Apple® launched the iPad 2™. These technological marvels, like their predecessors, illustrate several fundamental challenges that all design engineers face today: designs are getting more complex, and competition more fierce. As a result, the verification effort required to validate these designs is growing exponentially while schedules are shrinking. Consequently, design engineers' jobs are getting much harder.

Companies must dramatically grow their capital investment in IT infrastructure to support this exploding verification demand. Doing so is costly, time consuming, and in many cases constrained by practical limitations such as physical space, power availability, cooling capacity, and IT support resources. That is on top of the cost of the hardware itself, and in these times of tight budgets, most CFOs balk at increasing capital expenses (CAPEX). It's no wonder that customers tell us verification is already the most expensive aspect of ASIC/SoC design. The dilemma from the compute infrastructure perspective is clear: support the growing verification demand with limited cost increases, or take the blame for lengthening verification schedules.

While it's obvious that larger and more complex designs need more verification throughput, the requirement is not static. Verification environments typically have usage peaks and valleys, so even with some over-provisioning to protect schedules, compute resources sit idle during the valleys and are oversubscribed during the peaks. Early in a design, engineering is the limiting factor, since more bugs are found than can be fixed immediately. Later, as the remaining bugs become harder to find, larger simulations, longer individual tests, and growing queues are inevitable. Infrastructure then runs at 100% capacity and progress slows.

The above scenario is nothing new. For years, IT engineers have provisioned hardware with these variable demands in mind. While every company would like to support verification peaks, doing so is proving too costly and inherently inefficient, since provisioning for the worst-case peak means that for most of the project the servers sit underutilized. And even the best-laid plans can fall apart when the unexpected occurs, like a last-minute bug. Even if the fix is simple, verification may require repeating a full regression, which can take days or even weeks, making a schedule delay virtually certain. So in addition to the normal peaks and valleys, engineers must plan for unexpected, last-minute problems.

The ideal solution would combine baseline provisioning sized for average verification loads with elastic, scalable access to compute resources that can quickly ramp capacity up to meet peak needs. In worst-case scenarios, rapid scalability would let engineers compress weeks of verification into a day or two. Equally important, the solution should scale back down when demand subsides, to keep costs in check. Is cloud computing the answer?
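Whatever platform provides it, the behavior being asked for here reduces to a simple control loop. The Python sketch below is a minimal illustration under stated assumptions: queue_depth() and set_server_count() are hypothetical stand-ins for a real batch scheduler and a real provisioning interface, and all of the constants are illustrative.

    import time

    BASELINE = 60        # on-premise servers sized for the average load
    MAX_BURST = 500      # cap on rented cloud servers, to bound OPEX
    JOBS_PER_SERVER = 4  # pending jobs each server is expected to absorb

    def queue_depth() -> int:
        # Stub: a real flow would query the batch scheduler for pending jobs.
        return 0

    def set_server_count(target: int) -> None:
        # Stub: a real flow would call the cloud provider's provisioning interface.
        print(f"provisioning {target} servers")

    def desired_capacity(pending_jobs: int) -> int:
        # Track the queue, never dropping below baseline or exceeding the cap.
        return max(BASELINE, min(MAX_BURST, pending_jobs // JOBS_PER_SERVER))

    def autoscale_loop() -> None:
        while True:
            set_server_count(desired_capacity(queue_depth()))
            time.sleep(300)  # re-evaluate every five minutes

The essential property lives in desired_capacity: capacity follows demand in both directions, falling back to the baseline farm as soon as the peak passes, which is exactly what fixed, on-premise provisioning cannot do.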

It is certainly true that cloud computing has the potential to satisfy these scalability requirements. It provides access to hundreds of servers extremely rapidly, and it is flexible: the servers can be turned off the instant they're no longer needed. There are also several interesting economic benefits to adopting cloud computing. In addition to absorbing last-minute bugs without schedule delays, virtually unlimited resources mean that engineers can easily compress schedules. If a week-long regression could be completed in a day, earlier market entry would be possible, which should lead to more revenue and higher market share. Cloud computing also requires no additional CAPEX, because it is billed as an operating expense (OPEX), giving CAD managers the flexibility to spend capital on other needed resources.
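The schedule-compression claim is easy to sanity-check with back-of-envelope arithmetic. Every number below (test count, runtime per test, farm sizes) is an illustrative assumption, not a figure from a real project:

    # Back-of-envelope regression math; all numbers are illustrative assumptions.
    TESTS = 20_000            # tests in a full regression
    CPU_HOURS_PER_TEST = 0.5  # average simulation time per test
    total_cpu_hours = TESTS * CPU_HOURS_PER_TEST  # 10,000 CPU-hours

    def wall_clock_days(servers: int) -> float:
        # Ideal wall-clock time, assuming perfect parallelism across servers.
        return total_cpu_hours / servers / 24

    print(f"60-server in-house farm: {wall_clock_days(60):.1f} days")   # ~6.9 days
    print(f"500-server cloud burst:  {wall_clock_days(500):.1f} days")  # ~0.8 days

Under these assumptions, the regression that ties up the in-house farm for a week finishes overnight in the cloud. In practice the speedup is bounded by how well the suite parallelizes and by simulator license availability, so real results will fall short of the ideal.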

But before a company opts for such a dramatic change to its EDA infrastructure, it must weigh the following key tradeoffs:

  • Security: Cloud computing providers are likely more secure than the average enterprise customer. As part of their business model, they undergo regular independent security audits, so it is important to check for industry-accepted certifications such as ISO 27001/27002, SAS 70 Type II, and others. Cloud providers know they will have no business if their customers' data is not secure.
  • Liability: No provider is going to accept 100% liability for data theft, and cloud customers must be prepared to accept this limitation. This is precisely why cloud providers focus so much attention on security.
  • Corporate Policies: Many companies have policies governing moving corporate IP offsite. These policies must be reviewed before any cloud deployment, and updated as needed.
  • Licensing: Today's installed software licensing agreements don't cover cloud computing. New agreements will be needed, and the time and attention they require must be factored into planning.
  • Geography: Some countries restrict technology exports. It is therefore important to work with cloud providers whose global reach allows data and jobs to stay within compliant regions.
  • Automation: Extending an EDA environment into the cloud can be very straightforward or, depending on the customer's requirements, may call for an experienced partner to speed the initial transition; see the job-routing sketch after this list.
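
To make the automation point concrete, here is a deliberately simplified job-routing sketch. The Job fields, the flag names, and the eight-hour threshold are all hypothetical; a real flow would take its policy flags and queue estimates from the company's own scheduler and compliance systems:

    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        export_restricted: bool  # set per corporate IP and geography policy
        est_cpu_hours: float

    def route(job: Job, local_queue_hours: float, burst_threshold: float = 8.0) -> str:
        # Policy-restricted jobs never leave the site; everything else bursts
        # to the cloud once the local queue backs up past the threshold.
        if job.export_restricted:
            return "local"
        return "cloud" if local_queue_hours > burst_threshold else "local"

    print(route(Job("nightly_regress", False, 2.5), local_queue_hours=12.0))  # cloud
    print(route(Job("crypto_block_sim", True, 1.0), local_queue_hours=12.0))  # local

Even this toy version shows how several of the tradeoffs above (security, corporate policy, geography) can be encoded as routing rules rather than left to ad hoc judgment.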

For EDA workloads like verification, cloud computing represents the next paradigm shift. It offers the potential to deliver dramatic increases in verification throughput while simultaneously optimizing long-term costs. It can even give a company the option of cost-effectively pulling in development schedules to meet more aggressive market and revenue goals. With the right partner and a mature cloud provider, companies will be prepared to verify the largest designs, today and in the future.
