Seems like verification unification is in the air. We saw it recently with Synopsys, and now we have a move from Mentor.
While Synopsys’ version looked like an effort to unify acquired technology, Mentor’s efforts seem more internal. The big picture involves the unification of simulation, formal, emulation, and virtual prototyping under one umbrella, one interface. In that scheme, Mentor presents each of those technologies as an engine serving the higher-level verification goal; no longer is each a standalone tool.
But a big part of what’s happening here is about conjoining emulation and simulation more seamlessly. We actually looked at Cadence’s version of this some time back, when the verification environment was even more fractured and confusing. The high-level goal is to make the distinction between simulation and emulation effectively invisible to the user.
In concept, emulation should just be a faster simulation engine, and you should be able to push the pieces of your design around among the simulator, the emulator, and the virtual prototype based on what needs to be verified in the greatest detail and how many cycles are required, either to run the tests themselves or to support the tests (like getting past the boot-up sequence quickly).
In practice, of course, emulators are hardware, and so only so much of the testbench can move from a virtual environment into real hardware – the so-called synthesizable subset. Keeping a testbench flexible enough to straddle both worlds takes care on the part of verification engineers, and it’s also the point of the new verification IP (VIP) that Mentor has included as part of this announcement – VIP that transitions more easily between simulation and emulation.
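To make that a little more concrete, here’s a minimal sketch of the dual-domain testbench pattern that this kind of VIP embodies (the module names and protocol here are our own invented example, not Mentor’s actual VIP API). The timed, pin-level work is confined to a synthesizable transactor that can ride along with the design onto the emulator; the test itself stays untimed and talks only in transactions, so it doesn’t care which engine is underneath:

```systemverilog
// HDL side: a synthesizable transactor (BFM). Because it sticks to the
// synthesizable subset, it can be compiled onto the emulator with the DUT.
module write_bfm (
  input  logic        clk,
  output logic [31:0] wdata,
  output logic        wvalid
);
  initial wvalid = 1'b0;

  // All the timed, pin-level activity is confined to this task; the
  // untimed side sees only a transaction-level call.
  task automatic do_write(input logic [31:0] data);
    @(posedge clk);
    wdata  <= data;
    wvalid <= 1'b1;
    @(posedge clk);
    wvalid <= 1'b0;
  endtask
endmodule

// HVL side: an untimed test that stays on the workstation. It drives the
// design only through task calls, so the same test runs whether the BFM
// and DUT sit in the simulator or on the emulator.
module tb_top;
  logic        clk = 0;
  logic [31:0] wdata;
  logic        wvalid;

  always #5 clk = ~clk;  // free-running clock

  write_bfm bfm (.clk(clk), .wdata(wdata), .wvalid(wvalid));

  initial begin
    for (int i = 0; i < 4; i++)
      bfm.do_write(i * 32'h11);  // transaction-level call; no pin wiggling here
    $finish;
  end
endmodule
```

The point of the pattern is that only the transactor has to obey the synthesizable subset; the test, scoreboard, and the rest of the untimed machinery stay on the workstation side.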
So a big part of what’s new here is in Mentor’s Veloce emulator: their new OS3. And there are several new Veloce capabilities that are important for supporting this unification:
- It supports a more simulation-like interaction.
- Assertions can now be synthesized (there’s a sketch of what that means after this list).
- It now tracks coverage.
- It supports Mentor’s push to move emulators out of the lab and into the data center for more effective sharing and better machine utilization, including multi-tenancy on a single machine.
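For a flavor of what synthesizable assertions buy you, here’s an ordinary SVA checker of the sort that used to run only in simulation (the handshake protocol is our own invented example, not anything from Mentor’s announcement). Synthesized into emulator gates, a property like this keeps checking at full emulation speed, through billions of cycles of boot code and all:

```systemverilog
// Once 'req' asserts, 'ack' must follow within 1 to 4 cycles. In pure
// simulation this check runs at simulator speed; compiled onto the
// emulator, it rides along with the design at emulation speed.
module handshake_checker (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic ack
);
  property p_ack_follows_req;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:4] ack;
  endproperty

  a_ack: assert property (p_ack_follows_req)
    else $error("ack did not follow req within 4 cycles");

  // The same event can feed the coverage tracking mentioned above.
  c_ack: cover property (@(posedge clk) disable iff (!rst_n) req ##[1:4] ack);
endmodule
```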
Two supporting tools help with this. One is VirtuaLAB, which, somewhat surprisingly, was presented as new, but which we actually saw almost exactly two years ago. This is about eliminating rate matchers when generating “real-world” stimulus for verification. The VirtuaLAB boxes can also go into the data center as general stimulus generators, eliminating the need for someone to be physically present in a lab connecting wires to get data.
The other supporting tool is CodeLink, which we saw quite some time ago. It has supported offline simulation debug all along; it now supports offline emulation debug and review as well.
There’s actually a subtle consideration here for designs underway on existing Veloce machines. If you migrate your emulators into a data center and start sharing them under the new OS version, chances are the switch will happen in the middle of some design (it’s impossible to imagine a big company where all projects magically finish at the same time, creating an opportune window for change). The thing is, making changes in the middle of a design project is generally not great for schedule confidence. But Mentor assures us of full backwards compatibility, so verification plans built under the old expectations should run just as if nothing had changed.
Meanwhile, they’ve announced a new unified debugger called Visualizer that supports all of the engines, removing the need to switch debuggers when moving between engines.
And, continuing the unification theme, verification results are all stored in a single database, regardless of which engine generated them.
This whole unification movement reflects what’s happening on chips themselves: SoCs now integrate pieces that, in earlier times, would have been created, verified, and debugged separately. And with smaller chunks, you could use separate tools for separate parts of the verification plan. But that’s just not feasible now that every aspect of every circuit has to be known to work properly before cutting an outrageously expensive mask set.
You can get more info in their announcement.