
EDA for Sensor Fusion

Movea’s SmartFusion Studio

When you’re building something that’s never been built before, you’ve got a lot more work in store than if you simply settled for what’s already been done. There are lots of unknowns, and you’ve got to explore and resolve them before you can proceed.

If you’re doing an everyday human-sized project in your workshop to build something using unusual materials, you have to figure out what adhesives or other fasteners will work, whether any of the materials might interact (aluminum touching iron?), whether paints will cover adequately… You might have to try different variants or different formulations until you get something that performs well.

The good news here is that you’re working at a human scale. You can get bits and pieces of materials and paints and glues and whatnot, and you can easily experiment to see how things work.

At the micro- and nano-scale, if you’re designing a new integrated circuit, you may also have to experiment. Of course, here, you can’t work with your hands to do it. Instead, you have a rich set of EDA tools and libraries and models that can help you through the experimentation.

But what if you’re trying to assemble motion algorithms based on the data being generated by sensors? Each sensor is a bit different; different algorithms and filters have different effects and effectiveness, and the underlying math can be daunting.

Yes, you could argue that navigating this maze is part of the price of greatness in the MEMS world. Then again, anyone who thought that manually cutting rubylith was one of the rites of righteousness in the chip world would be feeling pretty unrighteous today.

The truth is that, as sensors proliferate, designers are going to want to skip over the boring bits and get right to a design that works. Or, perhaps it’s better to flip this idea around: in order for sensors to proliferate, designers have to be able to use them without understanding what’s going on under the hood, just the way you drive a modern car that you probably can’t maintain yourself because it’s too complex (and if you can, you’re just being stubborn).

That’s the direction Movea is trying to head with their evolving SmartFusion Studio. They unveiled the tool earlier this year, but it’s still a work in progress. It’s been an ongoing internal project for years – they’ve been using it for their own algorithm development – and now they’re moving toward making it available, with customer-friendly features, to outsiders.

The idea is to be able to configure an arrangement of sensors, calibrate them, and then apply algorithms to see how a fused output responds. It can work with Movea’s “atoms” and “molecules” – their names for basic and composite algorithm building blocks – and they plan to have drag-and-drop capability for those and other elements in the future.
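To make the block-assembly concept concrete, here’s a minimal sketch – in Python, with invented names rather than Movea’s actual API – of chaining small processing blocks into a pipeline:

```python
# Illustrative only: tiny processing "atoms" chained into a pipeline,
# mimicking the drag-and-drop assembly idea. All names are invented.

class Offset:
    """Subtract a constant bias from each sample."""
    def __init__(self, bias):
        self.bias = bias
    def process(self, sample):
        return sample - self.bias

class MovingAverage:
    """Smooth samples with a sliding window of length n."""
    def __init__(self, n):
        self.n, self.window = n, []
    def process(self, sample):
        self.window.append(sample)
        if len(self.window) > self.n:
            self.window.pop(0)
        return sum(self.window) / len(self.window)

class Pipeline:
    """Run each sample through a sequence of blocks in order."""
    def __init__(self, *blocks):
        self.blocks = blocks
    def process(self, sample):
        for block in self.blocks:
            sample = block.process(sample)
        return sample

# "Drag" two blocks onto a pitch signal and run some samples through.
pitch_chain = Pipeline(Offset(bias=0.03), MovingAverage(n=8))
smoothed = [pitch_chain.process(s) for s in (0.10, 0.12, 0.09, 0.11)]
```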

I got to see a demo with the drag-and-drop capability in place. They were working with a Nexus phone communicating over WiFi with the tool on a host PC. They captured the motion in a file that could be replayed in the tool to evaluate how algorithms affected that motion. In a really simple example, the raw motion – or even lack of motion – was highly jittery. A Butterworth filter (one of a large array of available filters, making me wonder how you’d know which ones to try) was dragged onto the pitch signal; the design was recompiled (critically, that took only a second or two); and now the motion was nice and smooth.
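For a rough picture of what that filtering step does, here’s a hedged sketch using SciPy: a low-pass Butterworth filter applied to a synthetic, jittery pitch signal. The sample rate, filter order, and cutoff are arbitrary choices for illustration, not values from the demo.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                          # sample rate in Hz (assumed)
t = np.arange(0, 5, 1 / fs)

# Synthetic pitch: a slow tilt plus high-frequency jitter
pitch_raw = 10 * np.sin(2 * np.pi * 0.5 * t) \
    + np.random.normal(0, 1.5, t.size)

# 4th-order low-pass Butterworth, 2 Hz cutoff (normalized to Nyquist)
b, a = butter(4, 2.0 / (fs / 2), btype="low")
pitch_smooth = filtfilt(b, a, pitch_raw)    # zero-phase filtering
```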

The idea is pretty straightforward: sort of a MATLAB for sensor fusion. There are lots of options to experiment with, and, with a quick compile, it’s not expensive to do so. There’s a broad collection of basic viewers, and Movea can help to develop custom “slides,” which are essentially views that correspond to a specific application. But the tool has its own set of complicating factors to deal with.

Calibration, for example. For combo sensors, the manufacturer will typically provide the calibration information. For other sensors, you might have to calibrate them yourself – especially early in the life of the sensor. The tool takes you through a calibration procedure so that you can fill in the necessary coefficients. Later in the maturity cycle, a default set of calibration values will have been agreed on, so at that stage, you might be willing to live with those.
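For a sense of what those coefficients usually look like, here’s a minimal sketch of the common bias-plus-scale-matrix correction for a three-axis sensor. The numbers are placeholders, not real calibration values:

```python
import numpy as np

bias = np.array([0.02, -0.01, 0.15])      # per-axis zero offsets
scale = np.array([[1.01, 0.00, 0.00],     # gain and cross-axis terms
                  [0.00, 0.99, 0.01],
                  [0.00, 0.00, 1.02]])

def calibrate(raw):
    """Map a raw 3-axis reading to calibrated units."""
    return scale @ (np.asarray(raw) - bias)

print(calibrate([0.05, 9.80, 0.20]))
```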

What platform resources you can access depends on how a platform is configured and what your development role is. Platform developers, who need to see low-level sensor details, will install the tool at the root level to bypass the protections and abstractions that the OS provides (or imposes, depending on your view). Apps developers, on the other hand, will be coding for a fixed, finished platform, meaning they have to work above the OS. For these guys, it’s impossible to access the low-level stuff unless the platform maker has provided some end-around to get to low-level data through a custom API.

The design flow itself will depend on who is doing the work. Platform developers will be capturing low-level sensor outputs in a file and replaying them in the tool, compiling for integration into the embedded application. App developers, on the other hand, will be developing at the OS API level, relying on the services provided by the embedded infrastructure and resulting in a standalone app.
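A hedged sketch of that capture-and-replay flow, using an invented CSV format (the tool’s actual file format isn’t public):

```python
import csv
import time

def capture(read_sample, path, n_samples=500):
    """Log timestamped raw sensor samples to a file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for _ in range(n_samples):
            writer.writerow([time.time(), *read_sample()])

def replay(path, process):
    """Feed logged samples back through a processing function."""
    with open(path) as f:
        for ts, *axes in csv.reader(f):
            process(float(ts), [float(a) for a in axes])
```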

For platform developers, the tool can be configured to run the software on the application processor or a sensor hub. Configurations for a variety of processors and sensor hubs are available.

There’s also the question of which fusion algorithm to use. Some sensors come pre-equipped with hard-coded algorithms; others don’t. And if your sensor has, for instance, a built-in tap function, then why would you re-implement that up at the software level?

Well, if you’re going to be in a phone or tablet, you’re going to have to be willing to share the stage: phone and tablet makers demand at least two sources for any component. Problem is, different vendors do things differently, so one vendor’s tap function won’t behave like another’s. That means using a sensor-agnostic algorithm at a higher level – one that abstracts away the differences between sensors.
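One way to picture that abstraction – purely illustrative, with invented names – is a common tap interface with vendor-specific adapters and a generic software fallback:

```python
class TapSource:
    """Common interface the higher-level algorithm codes against."""
    def tap_detected(self, accel_magnitudes):
        raise NotImplementedError

class HardwareTap(TapSource):
    """Adapter for a part that detects taps in hardware."""
    def __init__(self, read_hw_flag):
        self.read_hw_flag = read_hw_flag
    def tap_detected(self, accel_magnitudes):
        return self.read_hw_flag()   # ignore raw data; trust the part

class SoftwareTap(TapSource):
    """Generic fallback: flag a spike in acceleration magnitude."""
    def __init__(self, threshold=2.5):
        self.threshold = threshold
    def tap_detected(self, accel_magnitudes):
        return any(m > self.threshold for m in accel_magnitudes)
```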

Exactly how this product will be bundled and sold hasn’t been completely decided yet. As we’ve seen, there are two different audiences they’ll be targeting. The first is the phone manufacturers, who will be developing the low-level drivers that the OS relies on. These guys need to dive deeper, and, by definition, they have flexibility in the platform because they can make changes if necessary.

Later, Movea will target the app developer community, which has no say in how a platform is configured. Once app developers have targeted something like Android, they have to live within the strictures it imposes.

Movea will also be making more demo apps available – at the block level. The lower-level code that the tool generates or works with isn’t open.

So that’s a broad-brush look at the SmartFusion Studio concept as it stands today. We should be seeing more later this year as specific product configurations are announced.

More info:

Movea SmartFusion Studio

