
One More Step on the Path to AIIE (AI-In-Everything)

Almost anyone involved in developing new products today wants those products to have an artificial intelligence (AI) component [for the purposes of this column we will take AI to embrace machine learning (ML) and deep learning (DL)]. The problem is that AI is still very new in the scheme of things—everyone has heard about it, almost everybody uses it, but relatively few people know how to implement it.

A case in point is the Original Equipment Manufacturer (OEM), a company that makes parts or products for other companies to use in their own products. OEMs typically specialize in one or more market segments, such as automotive systems or medical equipment.

Given a choice, most OEMs want to focus on creating the products themselves. They don’t want to have to learn how to create, train, and deploy AI models. All they want is to say “here is what I want to do” and then be presented with an AI-on-a-chip that does it.

You may think this is a pipe dream (just the saying of which reminds me of the classic Animusic Pipe Dream video), but I’ve just been informed that this capability is a new reality.

I was just chatting with Mouna Elkhatib, who is the Co-Founder, CEO, and CTO of AONDevices. AONDevices was founded in 2018 by a group of industry veterans with an average of 20+ years of expertise in audio processing and AI. They started by creating AI IP, which has already been licensed and used by Tier-1 customers. More recently, they’ve used their IP to create their own Edge AI chips in the form of the AON1100 (offered in a 24-pin QFN package) and the AON1120 (offered in a 40-pin QFN package).

Both of these devices include a RISC-V microcontroller unit (MCU), two neural processing units (NPUs), and a hardware digital signal processor (DSP). Both offer exceptional power efficiency (<80µW in listening mode and <260µW at 100% processing). Both offer awesome performance and high accuracy in noisy environments (>90% hit rate at <0dB SNR with a single microphone). And both feature multi-sensor fusion, always-on speech and sound detection, classification, and speaker identification.

The AON1120 offers expanded capabilities over the AON1100, thereby enabling complex sensor fusion applications. For example, the AON1120 provides 16 general-purpose input/output (GPIO) pins, enhancing interface options for a broader range of sensors and peripherals. It also supports protocols such as I2S for audio capture and playback, along with SPI and I2C for sensor interfacing, facilitating the integration of diverse sensors like accelerometers, gyroscopes, and environmental sensors. Additionally, it includes a camera interface for super-low-power applications.
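To make that sensor-interfacing idea a little more concrete, here’s a minimal sketch of polling a 3-axis accelerometer over I2C. To be clear, the device address, register map, and i2c_read() helper are illustrative stand-ins of my own devising (stubbed so the example runs on a host), not the AON1120’s actual driver API.

```c
/* Hypothetical sketch: polling a 3-axis accelerometer over I2C.
 * The address, registers, and HAL call are illustrative stand-ins,
 * not AON1120 APIs; consult the actual datasheet/SDK. */
#include <stdint.h>
#include <stdio.h>

#define ACCEL_ADDR   0x19  /* example 7-bit I2C address */
#define REG_OUT_X_L  0x28  /* example data-register base */

/* Stand-in for a platform I2C driver; returns canned data here
 * so the sketch runs on a host. */
static int i2c_read(uint8_t addr, uint8_t reg, uint8_t *buf, int len) {
    (void)addr; (void)reg;
    for (int i = 0; i < len; i++) buf[i] = (uint8_t)(i * 17); /* dummy */
    return 0;
}

int main(void) {
    uint8_t raw[6];
    if (i2c_read(ACCEL_ADDR, REG_OUT_X_L, raw, sizeof raw) == 0) {
        /* Little-endian 16-bit samples: X, Y, Z */
        int16_t x = (int16_t)((raw[1] << 8) | raw[0]);
        int16_t y = (int16_t)((raw[3] << 8) | raw[2]);
        int16_t z = (int16_t)((raw[5] << 8) | raw[4]);
        printf("accel: x=%d y=%d z=%d\n", x, y, z);
    }
    return 0;
}
```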

Now, you are probably saying, “This all sounds like stuff I’ve heard before. What makes this different from other Edge AI chips that combine an MCU and an NPU?”

“Great minds think alike,” as they say. (Of course, they also say, “Fools seldom differ,” but I’m sure that doesn’t apply to us.) “This all sounds like stuff I’ve heard before. What makes this different from other Edge AI chips that combine an MCU and an NPU?” I asked Mouna.

Well, let’s begin with the fact that most of the existing chips that boast the combination of an MCU+NPU use “off-the-shelf” IP, like an Arm Cortex MCU coupled with an Arm Ethos NPU, for example. By comparison, the folks at AONDevices have created all of their IPs from scratch, because—as Mouna says—this is the only way to achieve the lowest possible power.

When I say, “they created all of their IPs from scratch,” I really do mean all of their IPs, starting with the RISC-V MCU and their NPU and DSP, and including the I2S, SPI, and I2C interface IPs. All I can say is, “color me impressed,” and we haven’t even started yet. 

My next question was, “Why two NPUs? Why not just use the MCU with a single NPU?” As Mouna explains, most products require some amount of sensor fusion, such as combining signals from a microphone with signals from a 3-axis inertial measurement unit (IMU). There are two ways to perform sensor fusion: either you fuse the data or you fuse the decisions.

Fusing the data and feeding it into a single NPU means creating a single model that can handle the data from multiple sensors. It turns out to be far more efficient to have lightweight models, each focused on its own detection and classification task, and then to fuse the outputs from these models at the decision level.
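For anyone who likes to see these things in code, below is a minimal sketch of decision-level fusion under my own illustrative assumptions (the class scores, thresholds, and agreement rule are invented for the example, not taken from AON’s implementation). Two lightweight models each classify their own sensor stream, and only their decisions are combined; data-level fusion would instead require one larger model trained on the joint audio+IMU input.

```c
/* Minimal sketch of decision-level fusion (my illustration, not
 * AON's implementation): each small model classifies its own
 * sensor stream, and only the per-model decisions are combined. */
#include <stdio.h>

#define N_CLASSES 3

/* Index of the highest-scoring class */
static int argmax(const float *scores, int n) {
    int best = 0;
    for (int i = 1; i < n; i++)
        if (scores[i] > scores[best]) best = i;
    return best;
}

int main(void) {
    /* Pretend softmax outputs from two lightweight models:
     * one watching the microphone, one watching the IMU. */
    const float audio_scores[N_CLASSES]  = {0.05f, 0.85f, 0.10f};
    const float motion_scores[N_CLASSES] = {0.10f, 0.75f, 0.15f};

    int audio_class  = argmax(audio_scores, N_CLASSES);
    int motion_class = argmax(motion_scores, N_CLASSES);

    /* Fuse at the decision level: fire only when both models agree
     * and are individually confident. */
    if (audio_class == motion_class &&
        audio_scores[audio_class] > 0.7f &&
        motion_scores[motion_class] > 0.7f) {
        printf("fused decision: class %d\n", audio_class);
    } else {
        printf("no confident fused decision\n");
    }
    return 0;
}
```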

Since the AON1100 and AON1120 are focused on applications running at the microwatt level, they use two NPUs, but future devices might extend this architecture by increasing the number of NPUs, each running its own small model.

How small? We are talking about only 50kB of coefficients (“weights”) per NPU/model. As Mouna says, “No OEM is going to use a low power chip if it doesn’t deliver the accuracy that is traditionally obtained with very large DSP algorithms or a large neural network. Our customers cannot compromise on accuracy. It’s extremely difficult to design a micro neural network with the accuracy of a large neural network. That’s what differentiates AON versus many other startups who are trying to build low power chips with in-memory compute.”
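To put that 50kB figure in perspective, here’s a quick back-of-the-envelope sketch; the layer sizes are an example topology I made up for illustration, not anything AON has disclosed. Assuming 8-bit weights, 50kB buys you roughly 50,000 coefficients, which is enough for a surprisingly capable micro network.

```c
/* Back-of-the-envelope check (my arithmetic, not AON's topology):
 * how large a network fits in a 50kB coefficient budget if the
 * weights are stored as 8-bit values. */
#include <stdio.h>

int main(void) {
    const unsigned budget_bytes = 50u * 1024u;  /* 50kB per NPU/model */

    /* Example tiny network: 40 input features -> 128 -> 64 -> 8 classes */
    unsigned params = 40u * 128u + 128u   /* layer 1 weights + biases */
                    + 128u * 64u + 64u    /* layer 2 */
                    + 64u * 8u + 8u;      /* output layer */

    printf("parameters: %u (%u bytes at int8)\n", params, params);
    printf("fits in 50kB: %s\n", params <= budget_bytes ? "yes" : "no");
    return 0;
}
```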

Do you remember earlier when I said: “Given a choice, most OEMs want to focus on creating the products themselves. They don’t want to have to learn how to create, train, and deploy AI models. All they want is to say ‘here is what I want to do’ and then be presented with an AI-on-a-chip that does it.”? Well, this is where the AONx360 online tool suite comes into play.

AONx360 online tool suite (Source: AONDevices)

First, this suite contains a “zoo” of ML models for detecting and classifying things like audio, speech, gestures, motion, ultrasound, and more.

The way this works is that the customer collects and uploads their data into the cloud-based AONx360 environment (in the case of speech, the customer doesn’t even need to collect any data—all they need to do is to create a list of commands to be detected and identified, and the folks at AON can take it from there).

The AONx360 environment also features advanced data augmentation (with synthetic data) capabilities. Next comes adaptive training, followed by testing and verifying the performance of the model with the trained coefficients. After assessing and analyzing the results, the coefficients can be loaded into an NPU on the customer’s AON1100 or AON1120 chip.
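As a taste of what data augmentation can involve, here’s a sketch of one classic technique: mixing noise into clean speech at a target SNR so the model learns to cope with noisy environments. This is a generic illustration of the concept; AONx360’s actual augmentation pipeline may well be more sophisticated.

```c
/* Sketch of noise-mixing augmentation: scale a noise clip so that,
 * when added to the speech clip, the result sits at snr_db.
 * Toy signals stand in for real recordings. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define PI 3.14159265358979323846

/* Mean power of a signal */
static double power(const float *x, int n) {
    double p = 0.0;
    for (int i = 0; i < n; i++) p += (double)x[i] * x[i];
    return p / n;
}

/* Mix noise into speech at a target signal-to-noise ratio (in dB) */
static void mix_at_snr(float *speech, const float *noise, int n, double snr_db) {
    double ps = power(speech, n), pn = power(noise, n);
    double gain = sqrt(ps / (pn * pow(10.0, snr_db / 10.0)));
    for (int i = 0; i < n; i++) speech[i] += (float)(gain * noise[i]);
}

int main(void) {
    enum { N = 16000 };                 /* one second of audio at 16kHz */
    static float speech[N], noise[N];
    for (int i = 0; i < N; i++) {
        speech[i] = 0.5f * (float)sin(2.0 * PI * 440.0 * i / 16000.0); /* "speech" */
        noise[i]  = (float)rand() / (float)RAND_MAX - 0.5f;            /* noise */
    }
    mix_at_snr(speech, noise, N, -3.0); /* stress-test below 0dB SNR */
    printf("augmented clip power: %f\n", power(speech, N));
    return 0;
}
```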

It’s also important to note that there are two operating modes for these devices. First is a standalone mode, which doesn’t require any other processor in the system. This would be of interest for simple applications like toys. In this case, the chip boots from its on-chip ROM, the RISC-V MCU configures everything from the on-chip Flash, and you’re “off to the races,” as it were.

The second mode is of use for things like TV controllers, smartwatches, and headsets that already contain an application processor (AP). In this case, the AON chip communicates through standard interfaces, allowing the AP to configure it. Then the AP goes to sleep, leaving the AON chip to monitor what’s happening. The AON chip only wakes the AP when required.
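For illustration, the AP-side flow might look something like the sketch below. All the function names here are hypothetical stand-ins (stubbed so the example runs on a host); the real host interface would come from AON’s documentation and SDK.

```c
/* Sketch of the AP-side flow described above (hypothetical names,
 * not the actual AON SDK): configure the always-on chip, go to
 * sleep, and service its wake line when something is detected. */
#include <stdbool.h>
#include <stdio.h>

/* Stand-ins for platform calls */
static void aon_configure_over_i2c(void) { puts("AP: configured AON chip"); }
static void ap_enter_deep_sleep(void)    { puts("AP: sleeping"); }
static bool wake_pin_asserted(void)      { return true; /* pretend event */ }

int main(void) {
    aon_configure_over_i2c();   /* AP sets up models/thresholds, then... */
    ap_enter_deep_sleep();      /* ...drops into its lowest power state */

    /* The AON chip keeps listening; only a detection wakes the AP. */
    if (wake_pin_asserted()) {
        puts("AP: woken by AON chip, handling event");
    }
    return 0;
}
```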

TV controller example (Source: AONDevices)

We have an Amazon Fire TV in our house. I love the speech control feature when I can say things like “Find xxx” or “Increase volume” or… you get the idea. I hate (well, strongly dislike) having to press the microphone button on the controller before issuing my voice command. Having to press a button sort of defeats the object of the exercise. 

So, you can only imagine my surprise and delight at the recent announcement that AONDevices and Atmosic Technologies have unveiled an always-on remote control, thereby facilitating intuitive user experiences. This bodacious beauty combines voice AI, sound detection, and speaker identification with ultra-low-power consumption.

Also of interest was the recent announcement that AONDevices has unveiled a revolutionary Edge AI sensor module in collaboration with P-Logic Consulting.

New AONix Edge addresses key challenges in Edge AI adoption (Source: AONDevices)

As always, there’s far more to this story—certainly far more than I can cover here—but you can learn more by reaching out to the folks at AONDevices. Also, if you happen to be visiting CES 2025, then make sure to visit the Atmosic suite and the Seltech suite, both at the Venetian, because both will have demonstrations featuring technology from AONDevices.

Well, I don’t know about you, but—as usual—I am awed and excited by all the developments currently taking place in AI space (where no one can hear you scream). It seems that each day I’m exposed to an innovative new concept or technology. In addition to the extremely low power aspects of the chips from AONDevices, I love the idea of OEMs (and others) simply being able to capture and upload data and receive a trained model in return. What say you? Do you have any thoughts you’d care to share on any of this?

