
Sub-threshold Design

Ambiq and PsiKick Chart a Challenging Path

We’ve been turning it down for years.

Energy consumption has gradually grown as a concern, to the point where it’s eclipsed performance as a primary driver for many circuits. To reduce power, you can do one of two things: turn down frequency (for dynamic power) or turn down the supply voltage. We’ve already stopped driving clocks as hard as we used to, what with the shift to multicore for scaling performance. But we’re still turning down the voltage.

The first move, taking logic from 5 V, where it had sat for years, down to 3.3 V, happened… a long time ago. Some components still use 3.3 V, but the leading-edge stuff is all down in the 1-point-something range. And drifting south.

But there’s a problem: We’ve had this “digital” approach to using a transistor. It has an “off” state and then some sort of “on” state. For analog, “on” means biased to some delicate point with feedback keeping all the poles where they’re supposed to be so that it’s stable. For digital, we just crank the crap out of it so that it’s on “hard.”

The thing that makes the difference between “off” and “on” is the threshold voltage. We used to assume that, below threshold, the thing was “off.” In more recent years, we’ve had to deal with the annoying reality that there is current that flows even in that region. This sub-threshold leakage has not been considered a good thing. But, all in all, we’ve come to grips with the fact that, kvetch though we may, the current doesn’t actually drop to zero until the voltage does.
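The standard first-order device model makes the point: below threshold, the drain current doesn’t switch off, it falls exponentially with gate voltage, and it only truly vanishes when the bias does. Here’s a minimal sketch of that textbook relationship, with placeholder device parameters (this is generic device physics, not either company’s model):

```python
import math

# Illustrative first-order sub-threshold current model. All device
# parameters below (i0, vth, n) are placeholders chosen for illustration.
def subthreshold_current(vgs, vth=0.4, i0=1e-7, n=1.5, vds=0.3, temp_k=300.0):
    """Drain current (A) for vgs below vth: exponential in (vgs - vth)."""
    vt = 1.380649e-23 * temp_k / 1.602176634e-19  # thermal voltage kT/q, ~26 mV at 300 K
    return i0 * math.exp((vgs - vth) / (n * vt)) * (1.0 - math.exp(-vds / vt))

# The current drops by a fixed factor for every ~60-100 mV of gate swing,
# but it never actually reaches zero until the bias voltages do.
for vgs in (0.4, 0.3, 0.2, 0.1, 0.0):
    print(f"Vgs = {vgs:.1f} V -> Id ~ {subthreshold_current(vgs):.2e} A")
```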

But this has simply been treated as a second-order effect, muddying up our definition of “off” while still treating that regime as “off.” According to that paradigm, you can bring the voltage down only to the threshold, below which, by definition, you stop turning the transistor on. And a circuit that has no transistors “on” isn’t particularly useful.

Under those assumptions, we may have to stop turning the voltage down soon; there’s not much room left to move.

But in a less-visible, somewhat scary corner of the design world, a few folks have decided to re-examine these core assumptions. Because, really, we’ve got this cognitive conflict: the state we call “off” has this nasty lingering “on” behavior. So far, we’ve pushed the technology so that we can keep pretending that it’s an “off” state. But… what if, instead, we switched our thinking: if it has “on” characteristics – that is, if current flows – then why not consider it an “on” regime?

Looked at that way, the transistor is “off” only when the voltage is zero. Above that, there are three regimes: from zero to somewhere near threshold, a transition region right around threshold, and then the region we now think of as “on.” These are referred to as “sub-threshold,” “near-threshold,” and “super-threshold,” respectively.
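Just to pin down the vocabulary, here’s a toy classifier along those lines. The ±50 mV “near-threshold” window is an arbitrary choice for illustration; real definitions vary with process and with who you ask.

```python
def operating_regime(v_gs, v_th, window=0.05):
    """Classify the gate drive into the regimes described above.

    The +/- 50 mV near-threshold window is an illustrative assumption,
    not a standard definition.
    """
    if v_gs <= 0.0:
        return "off"
    if v_gs < v_th - window:
        return "sub-threshold"
    if v_gs <= v_th + window:
        return "near-threshold"
    return "super-threshold"

print(operating_regime(0.25, v_th=0.40))  # -> 'sub-threshold'
```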

 

[Figure 1]

 

There are at least two companies – Ambiq and PsiKick – that design circuits that reside completely in the sub-threshold regime. By today’s standards, such circuits never turn “on.” And yet they can do useful work at very low power.

Too insensitive and too sensitive

Let’s start by acknowledging that what we’re going to review here is hard. And the difficulty starts with two conflicting intrinsic characteristics of sub-threshold design as compared to super-threshold design. The first deals with defining “on” and “off” states for digital logic. It’s all about how much current flows in the transistor, and, for super-threshold design, the difference between “off” current and “on” current is many orders of magnitude. Down in the sub-threshold region? “On” current is only about 1000 times more than “off” current.
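That roughly-1000x figure is easy to sanity-check with a back-of-the-envelope calculation, assuming a sub-threshold swing of about 90 mV per decade of current and a 0.3 V rail (both assumed numbers, not Ambiq’s):

```python
S = 0.090    # assumed sub-threshold swing, V per decade of current
vdd = 0.30   # assumed supply rail for the sub-threshold logic, V

# Below threshold, current changes by one decade for every S volts of gate
# swing, so the best-case on/off ratio is about 10^(Vdd / S).
on_off_ratio = 10 ** (vdd / S)
print(f"Ion/Ioff ~ {on_off_ratio:.0f}")  # a few thousand, vs. many decades super-threshold
```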

 

[Figure 2 (Image courtesy Ambiq)]

 

So right away, you have a circuit whose “on” and “off” states aren’t separated by much, meaning you need a more sensitive detector.

But there’s another problem. The absolute value of these currents is strongly affected by things like temperature and process. In other words, even as the “on/off” signal itself is insensitive, the response to external conditions is all too sensitive.
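The same exponential behavior that shrinks the on/off ratio also makes the absolute current jumpy. As a rough illustration, again assuming a 90 mV/decade swing, a modest process-induced threshold shift moves the current by a large factor:

```python
S = 0.090     # assumed sub-threshold swing, V per decade
dvth = 0.050  # assumed 50 mV threshold-voltage shift from process variation

# Current scales as 10^(-Vth/S), so a Vth shift of dvth multiplies it by:
print(f"50 mV Vth shift -> current changes by ~{10 ** (dvth / S):.1f}x")

# Temperature makes it worse: both the swing (via kT/q) and Vth itself move
# with temperature, so sub-threshold current climbs steeply as the die heats up.
```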

 

[Figure 3]

 

Living with these characteristics means being at peace with the fact that, whatever the benefits of sub-threshold, it’s a pain in the tuckus. If a particular circuit, for whatever reason, won’t see much of a power savings from going sub-threshold, then it’s just going to be easier to do standard super-threshold design.

In other cases, you may need to juice things up a bit to get the required performance. This could mean going into the near-threshold region or even all the way through to super-threshold.

So a given circuit is likely to be mostly sub-threshold, with a possible garnish of near- and super-threshold devices. And those devices will require a higher supply.

Both companies seem to be making more noise of late: Ambiq with a new whitepaper, PsiKick with a Series A funding round.

You might think, given the pre-eminence of power as a criterion, that everyone would be jumping into this. But these two companies have spent years getting from idea to production simply because of the nitty-gritty details – from developing a well-characterized library of circuits to cobbling together a workable design flow to achieving good yield to logistics like finding a tester with a PMU (parametric measurement unit) that can measure nanoamps. Not for the faint of heart.

And my sense of it is that all sub-threshold design – analog or digital – has that fussiness that we usually associate only with analog. Along with that comes a reliance on experienced individuals, with lots of design scars to show for what it took to get that experience. When I asked PsiKick what their “secret sauce” was, CEO Brendan Richardson listed four items:

  • A good understanding of sub-threshold digital systems
  • Strength in extremely low-power radios
  • Good knowledge of how to integrate those two together in a single chip
  • Skill in dealing with sporadic power sources (vs. the constant delivery of energy typical of power supplies and batteries)

These are very specific skills that inform the kinds of projects they focus on. Sub-threshold can have value for many different areas, but, for the moment, they’re leveraging what they know best.

I also sense a similarity to analog design when it comes to EDA tools. While digital has seen rampant abstraction, analog design is mostly done by hand – with the tools helping the designers to make or implement decisions. I have a sense that this characteristic will apply to sub-threshold digital design as well, to a certain extent. That’s not to say that these guys wouldn’t welcome some more love from their EDA tools, but whether, for instance, automatic layout from RTL would ever be possible remains to be seen.

The power levels that these techniques achieve are pretty amazing. But if it remains the domain of specialists, then Ambiq and PsiKick appear to be well positioned as the owners.

 

More info:

Ambiq Micro

PsiKick

 
