After years of speed trumping everything else, we’re now in a period where speed often takes a back seat to power. Batteries are in; dead batteries are out. So, for a few years now, power analysis has been rushing to catch up with timing analysis in the IC design tool chain.
Power analysis has never been a quick and easy deal – if you want high accuracy. We do much of our design at higher levels of abstraction so that we can do more in less time. But, ultimately, understanding power requires knowing what’s happening with each physical signal at any given time. And the only way to do that is to simulate the physical signals – that is, at the gate level – instead of simulating higher-level constructs that abstract those physical details away.
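To see why signal-level detail matters, consider the standard CMOS dynamic-power approximation, P = α·C·V²·f summed over nets, where the activity factor α of each net is exactly the per-signal information that higher abstraction levels throw away. A minimal sketch (made-up capacitances and toggle rates, nothing to do with Baum’s internals):

```python
# Toy illustration of why power needs per-signal activity: the standard
# CMOS dynamic-power approximation P = a * C * V^2 * f, summed over nets.
# The activity factor 'a' differs wildly between nets, which is what
# gate-level simulation reveals and RTL abstraction hides.
VDD = 0.8   # supply voltage in volts (hypothetical)
FREQ = 1e9  # clock frequency in Hz (hypothetical)

# (net name, load capacitance in farads, toggles per cycle) - made-up values
nets = [
    ("clk_buf",  5e-15, 1.0),   # clock net toggles every cycle
    ("alu_out",  2e-15, 0.3),   # data nets toggle far less often
    ("idle_bus", 8e-15, 0.01),  # mostly quiet
]

def dynamic_power(nets, vdd, freq):
    """Sum a*C*V^2*f over all nets; returns watts."""
    return sum(a * c * vdd**2 * freq for _, c, a in nets)

p = dynamic_power(nets, VDD, FREQ)
print(f"{p * 1e6:.2f} uW")
```

Two nets with identical capacitance can differ in power by orders of magnitude purely through their activity factors, which is why averaged or abstracted activity numbers miss the real picture.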
Problem is, gate-level simulation is slow. Which means that we have to decide both which windows of activity to simulate and which critical blocks we should focus on. Full simulation of the entire circuit for a complete scenario just isn’t possible with a tight delivery schedule.
To address this situation, a startup called Baum has launched a new tool and approach to power analysis that they say provides the accuracy of gate-level power analysis without the long run times associated with repeated gate-level simulations.
What they do is to create a proprietary power model of the circuit. This model encompasses gate-level information – but is used for RTL-level simulation. The model itself takes some time to create – 20 million gates will take about an hour to process into a model. The trick is that, given the model, analysis in conjunction with RTL simulation goes really quickly, so the one-time cost of model creation is paid back by substantial savings in simulation time.
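The amortization argument is easy to check with back-of-the-envelope numbers. Using the article’s one-hour model build and (as a placeholder) the low end of the speedup claim made later, with a made-up gate-level run time:

```python
# Back-of-the-envelope amortization of the one-time model-build cost.
# All run times here are hypothetical except the ~1 hour build quoted
# for a 20M-gate design; the 300x figure is the low end of the claimed
# speedup range.
MODEL_BUILD_H = 1.0            # one-time model creation, hours
GATE_SIM_H = 100.0             # hypothetical full gate-level power run
RTL_SIM_H = GATE_SIM_H / 300   # same scenario against the power model

def total_hours(runs):
    """Total cost of 'runs' power-analysis scenarios using the model."""
    return MODEL_BUILD_H + runs * RTL_SIM_H

print(total_hours(1), "h with model vs", GATE_SIM_H, "h gate-level")
```

Under these assumptions the model pays for itself on the very first scenario, and every additional scenario after that is nearly free by comparison.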
(Image courtesy Baum)
Model creation is fed by the RTL and netlist, giving gate-level information and some level of correlation between the two. You also run a limited gate-level simulation (once, for model creation only) and use the FSDB (or VCD) as well. (If you wish, you can generate those files from RTL simulation instead.) Libraries feed in process information – in particular, switching current, leakage, and internal current. Parasitics (SPEF) and constraints (SDC) can also contribute, as an option.
The model starts with their PowerWurzel tool. There’s actually a bit of linguistically playful consistency going on here: “baum” means “tree,” and “wurzel” means “root” (in German). So the PowerWurzel tool, they say, acts as a kind of foundation for the PowerBaum model generator. What that means in practice is that PowerWurzel acts as a gate-level power estimation tool, characterizing the circuit and feeding the data to PowerBaum.
From those inputs, PowerBaum creates the power model. That model interacts with the RTL simulation to create the power analysis output.
Model options
When building the model, you can opt for a full-circuit version, or you can tag various hierarchical nodes and generate specific models for those subcircuits. Multiple subcircuit models can be invoked during simulation; they will behave as if they were combined into a single model. You can also create a model for a partially completed circuit using behavioral models for the as-yet unfinished modules. That gives you a chance to get an early read on potential power issues without waiting for all blocks to be complete.
This brings some particular value to IP providers: they’ll be able to create an abstract power model of their IP block, giving their customers a tool for including the block’s impact on power without revealing the underlying circuit. The IP block model can then work with any other models included in the circuit to give accurate power results.
While the figure suggests that you can now do RTL-level simulation, you can simulate at any level of abstraction – even software. As long as the simulation environment provides a PLI connection, you’re good. The model queries the simulator each cycle and determines which signals should be included in a power calculation for that cycle. That signal selection process apparently isn’t obvious, and they consider it part of their secret sauce.
They say that the data exchange on each cycle happens quickly, so simulation time is impacted only slightly. It will still feel like RTL simulation, not gate-level simulation. And they can keep up with most any speed of simulation. Given the activity information from a given simulation clock cycle, the power information is reported separately to their tool (that is, it’s not reported back to the simulator – the simulator has no idea why it’s sending data over).
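The per-cycle exchange can be sketched conceptually. This is a hypothetical API, not Baum’s actual interface: each cycle, the simulator-side hook hands a set of toggled signals to the power model, which accumulates energy without ever reporting anything back to the simulator.

```python
# Conceptual sketch of a one-way per-cycle exchange (hypothetical API,
# not Baum's): the simulator hook pushes toggled signals to the power
# model each cycle; the model accumulates energy and builds a power
# profile. Nothing flows back to the simulator.
class PowerModel:
    def __init__(self, energy_per_toggle):
        self.energy = energy_per_toggle  # joules per toggle, per signal
        self.total_j = 0.0               # accumulated energy
        self.per_cycle_w = []            # power profile over time, watts

    def on_cycle(self, toggled, period_s):
        """Called once per simulated cycle with the toggled-signal set."""
        e = sum(self.energy.get(sig, 0.0) for sig in toggled)
        self.total_j += e
        self.per_cycle_w.append(e / period_s)  # instantaneous power

# Made-up per-signal energies and three cycles of a 1 GHz simulation
model = PowerModel({"clk": 5e-15, "alu": 20e-15, "bus": 40e-15})
for toggled in [{"clk", "alu"}, {"clk"}, {"clk", "alu", "bus"}]:
    model.on_cycle(toggled, 1e-9)
print(model.per_cycle_w)
```

The key property the article describes is captured here: the simulator only sends activity data over; the power side does all its own bookkeeping, so simulation speed is barely affected.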
By running the actual power analysis scenarios at the RTL level, they claim that power simulations now run 300-5000 times faster, with accuracy of 95% or better as compared to full gate-level simulation. This makes it possible to run power analysis on a full circuit for a full scenario. Or multiple scenarios. No more guessing when the biggest problems will occur or which modules might present the biggest issues. Just run them all to be sure. If power looks to be too high in some portion of a scenario, you can dig into specific modules for more detail on where the power is going.
Proving that your circuit meets its power requirements is the obvious application of this tool. But they point out an additional less-obvious purpose for the tool: exploring whether current differences can act as a telltale, enabling so-called side-channel security attacks. That’s where an attacker learns to associate some physical phenomenon of a chip – like power – with “hidden” information like a security key. By knowing whether or not the power seems “random,” you can work to randomize it further if necessary.
This capability is, at least at present, available only for digital circuits. If you’re trying to analyze your analog circuits, you’re still going to need to run to the comfort of SPICE for that. When analyzing a mixed-signal circuit, they will let you input an average power for the analog portions so that you can consider the net power, even if it won’t have the same detailed accuracy.
More info:
What do you think of Baum’s approach to power analysis?