
Ambiq Apollo4 Undercuts IoT Power

Unique Subthreshold Voltage Drops MCU into Microwatt Range

“Whoever is new to power is always harsh.” — Aeschylus

Ambiq has a new microcontroller. Of all the MCUs you could pick for your next IoT design… this is one of them. 

If that sounds like damning with faint praise, I don’t mean it that way. It’s actually a very nice part, as we will see. It’s just that the market for low-cost MCUs is very crowded. It’s hard for one vendor – much less one specific chip – to stand out. There are only so many ways you can slice, dice, and fine-tune this fiercely competitive segment. 

First, the basics. Ambiq is a smallish (135-person) fabless chip company based in Austin. It’s been around for 10 years, so it’s hardly a startup, even if it’s staffed like one. The company’s charter (do companies still do that?) is to make super-low-power MCUs for battery-powered devices. 

That’s all very nice, but it also pretty much describes a dozen other IoT-MCU wannabes. What makes Ambiq so special? The company’s USP (unique selling proposition, for salespeople) is spelled SPOT, which stands for subthreshold power optimized technology. In short, Ambiq designs circuitry that switches at very low voltages, around 0.3V, which is well below the level that most digital circuits would consider to be “off.” Ambiq has been productizing its SPOT-based devices for years now, so the spooky technology is now well-characterized. And it makes for good MCUs. 
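Why does switching at 0.3V matter so much? As a rough, back-of-the-envelope illustration (my numbers, not Ambiq's): the dynamic switching power of CMOS logic scales roughly with the square of the supply voltage,

$$ P_{dyn} \approx \alpha \, C \, V_{DD}^{2} \, f, $$

so dropping the core supply from a conventional ~1.1 V down to ~0.3 V cuts the energy of each switching event by roughly $(1.1/0.3)^2 \approx 13\times$, before any of the company's other tricks are counted.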

The new MCU is called Apollo4 and it is, not coincidentally, the fourth generation of the company’s Apollo family. This and all previous generations are based on ARM’s Cortex-M4F CPU design. M4F is a midrange microcontroller core with a floating-point unit, and in Apollo4 it runs at up to 192 MHz. That’s about twice as fast as Apollo3, and about four times faster than Apollo2. 

Surrounding this familiar CPU is a plethora of peripherals: UARTs, SPI, I2C, ADC, timers, random-number generator (RNG), audio outputs, crypto accelerator, GPIO pins, and plenty more. The CPU itself is bordered by 64 KB of instruction cache and 384 KB of either data cache or tightly coupled memory (TCM), a Cortex hallmark. It wouldn’t be a standalone MCU without on-chip memory, so Apollo4 has 2 MB of MRAM (magnetoresistive RAM) and another 1 MB of SRAM. 

What makes Apollo4 interesting (aside from its low-power technology) are its display controller and its optional Bluetooth LE interface. The display resolution stretches only to 640×480, so it’s not videogame quality, but it does support 32-bit color. Creating a convincing and customer-pleasing display is more about color depth than resolution, so Apollo4 should be more than good enough for smart refrigerators, home security systems, health monitors, and other GUI-rich devices. 
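One memory-budget note worth adding (my arithmetic, not a figure from Ambiq): a full frame at that resolution and color depth works out to

$$ 640 \times 480 \times 4\ \text{bytes/pixel} \approx 1.2\ \text{MB}, $$

which is worth keeping in mind alongside the on-chip memory figures above when planning how the GUI gets refreshed.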

The BLE baseband modem and RF stage are optional and are there primarily for location finding through the spiffy angle-of-arrival/angle-of-departure (AoA/AoD) feature of BLE 5.1. You can also use it to connect to other smart devices (e.g., watches). What Apollo4 doesn’t offer is any other type of wireless interface. There’s no Wi-Fi, Thread, Zigbee, or any of the other interfaces that seem like obvious candidates for an IoT-focused MCU. Ambiq says its BLE-or-nothing approach lets designers pick whatever interface they want and implement it with off-chip logic; building Wi-Fi and the rest onto the chip would add cost and almost guarantee that some interfaces would go unused. A small company like Ambiq can’t afford to maintain too many different SKUs. 

So, how low-power is this new low-power device? That’s tough to measure, since it depends on so many variables. Ambiq says that Apollo4 consumes just 3 µA per MHz (about 10 µW/MHz at 3.3 V), which works out to about 2 mW at top speed. Sounds good. Ambiq’s previous-generation Apollo3 consumed twice that much power, and the generation before that (Apollo2) ate up three times as much. So, even by Ambiq’s standards, Apollo4 is an extremely low-power device. 
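For those keeping score at home, the arithmetic checks out (using the company's own numbers and the 192 MHz top speed):

$$ 3\ \mu\text{A/MHz} \times 3.3\ \text{V} \approx 9.9\ \mu\text{W/MHz}, \qquad 9.9\ \mu\text{W/MHz} \times 192\ \text{MHz} \approx 1.9\ \text{mW}, $$

which is where the roughly 2 mW full-throttle figure comes from.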

Apollo4 is sampling now, so it’ll be a while before we see it in everyday devices. You’ll know how to find them. They’ll be the ones with the longest battery life. 
