
Cleaning Up the Verification Shop

It’s one thing if different tools from different divisions of the same company don’t talk to each other seamlessly. Generally considered poor form. While that used to be common, EDA folks have cleaned that up a lot over the years.

It’s generally better accepted when tools from one company don’t necessarily integrate well with tools from another company. If there are good strategic reasons, it will happen. If not, then, as a designer or EDA manager, you’re on your own for patching the tools together.

But what about when, as a company, you go on a multi-year shopping spree? Now tools that used to be made by different companies have magically transformed into tools from different – or even combined – divisions within the company. So what might have looked tolerable amongst multiple companies starts to look messy within a single company.

Of course, we know who our intrepid EDA shopper is: They of the Endlessly Open Purse, Synopsys. They recently announced that they are bringing their various verification technologies together under the unified moniker “Verification Compiler.” This unites, to a degree,

  • Static and formal analysis
  • Simulation
  • Coverage management/analysis
  • Verification IP
  • Debug

The nature of how this comes together takes a couple of forms, with more yet to come. To a certain extent, this is a packaging/licensing thing, where what used to be separate products can now be purchased and managed together as a bundle.

From an outside user’s view, however, you will still run the tools as you always did – this isn’t an integration into a seamless, consistent, unified GUI – although that’s the part that’s likely to come in the future. For now, use models will remain similar.

But it’s not only a marketing thing. Underneath, these tools have had their engines upgraded, and, in particular, they have been made to talk to each other much more efficiently using native integration rather than slower (but more portable) approaches like PLI. The entire suite of tools can also be scripted into a unified flow, rather than the current situation, where each tool has its own distinct flow.
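For anyone who hasn’t bumped into PLI: it’s the standardized C interface through which external code talks to a Verilog simulator via generic handles and marshalled value structures. That genericity is what makes it portable across simulators, and it’s also why every value access pays an indirection tax compared with native integration. Here’s a minimal sketch in the classic VPI (PLI 2.0) style; the $report_value task name is made up purely for illustration.

```c
#include "vpi_user.h"   /* standard header shipped with VPI-capable simulators */

/* calltf for a hypothetical $report_value system task. Every value
 * crosses the simulator boundary through generic vpiHandle objects
 * and marshalled s_vpi_value structs -- portable across simulators,
 * but each access is an indirect call rather than a direct read. */
static PLI_INT32 report_value_calltf(PLI_BYTE8 *user_data)
{
    (void)user_data;
    vpiHandle systf = vpi_handle(vpiSysTfCall, NULL);
    vpiHandle args  = vpi_iterate(vpiArgument, systf);
    if (args == NULL)
        return 0;                      /* task called with no arguments */

    vpiHandle arg = vpi_scan(args);
    s_vpi_value value;
    value.format = vpiIntVal;
    vpi_get_value(arg, &value);        /* marshalled read of the signal */
    vpi_printf("value = %d\n", (int)value.value.integer);

    vpi_free_object(args);             /* release the unfinished iterator */
    return 0;
}

/* Registration: tells the simulator the task exists and what to call. */
static void register_report_value(void)
{
    s_vpi_systf_data tf;
    tf.type      = vpiSysTask;
    tf.tfname    = (PLI_BYTE8 *)"$report_value";
    tf.calltf    = report_value_calltf;
    tf.compiletf = NULL;
    tf.sizetf    = NULL;
    tf.user_data = NULL;
    vpi_register_systf(&tf);
}

/* Simulators scan this NULL-terminated list at startup. */
void (*vlog_startup_routines[])(void) = { register_report_value, NULL };
```

Every vpi_get_value() call there crosses that generic boundary, which is exactly the kind of overhead native integration between tools can avoid.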

The big win from these nuts-and-bolts improvements is performance. Synopsys posts some pretty impressive gains, summarizing them as 5 times the speed (yielding 3 times the productivity). One formal project run by an unnamed customer ran 21 times faster. Capacity has also improved – in some cases by as much as 4 times.

One important message in the face of this inter-tool bonding: Verdi is remaining open. You may recall that one of the items in Synopsys’s shopping cart was SpringSoft, and the Verdi debug tool has a popular open interface and ecosystem. Even though they’re tightening their internal integration with Verdi, they’re not closing off access to outsiders.

In case you’re bringing out your checkbook right now, heads-up: unless you are amongst the anointed, you probably can’t get it yet. This is targeted for end-of-year broad availability; for now, it’s being wrung out by “limited customers.” I’ll leave it to you and Synopsys to decide whether you’re one of them.

And you can find out more about this in their release.
