
Smart Sensor Sees in 3D

Toposens Creates Ultrasound Sensor for Robots, Cars, and Robot Cars

“[Motion sensing hand dryers] never work, so I just end up looking like I’m waving hello to a wall.” – Jimmy Fallon

In school I was taught that a thermostat is a robot, inasmuch as it responds to its surroundings in real time. Years later, I was the head of engineering for a robot company that made big, scary machines with 500-pound arms. Those seemed like “real” robots to me. Now we can buy robots to vacuum under our furniture, look after our garden sprinklers, patrol grocery aisles, or just to play around with.

What all robots have in common is a need to perceive and respond to their environment. Without that sensory aspect, my teacher said, all you’ve got is a machine. Robots must be able to change their behavior based on external – and unexpected – inputs. That closed-loop feedback was the defining characteristic of a robot, in his view. 

“‘Easy to use’ is easy to say,” as the GUI experts tell us. It’s one thing to throw sensors at a product and say it’s autonomous, but that understates the complexity of the problem. What are you sensing, exactly? How accurately? How quickly? How effectively? And, what do you do with the raw data the sensors produce?

A German startup has tackled all those problems head-on and come up with a $100 device that should make robot-building a bit less daunting. 

Toposens is a 22-person company based in the leafy and gemütlich Bavarian capital of Munich. The founders didn’t start out to make sensors. Instead, they were trying to build a robotic fish. As you do. 

The problem with the fish (apart from waterproofing everything) was sensing its surroundings in three dimensions, underwater, in real time. Water can be a tough medium to work with because it distorts light and sound a lot more than air does. And fish can move in all directions. To make sense of the (ahem) streams of data flowing in from its sensors, the founders developed a unique algorithm to crunch the raw data. That took four years of effort but solved a tough problem: sensing in 3D. So much so that they abandoned the fish idea and instead developed a new 3D ultrasound sensor built around their sensor-fusion algorithm.

The result is TS3, a sensor module about the size of a pack of gum, with a UART port on one side. What makes TS3 different from other ultrasound sensors is that it works in three dimensions simultaneously. It has a single output transducer, but three receivers. More importantly, it also has built-in hardware to implement Toposens’s secret algorithm that combines the data from all three receivers into one unified image of the world outside. 

Typical ultrasound sensors are essentially one-dimensional, sending out a 40-kHz pulse and measuring the time it takes for the sound to reflect back to a receiver. This gives a “flat” view of the world, like looking through one eye. Such 1D sensors allow a home vacuum robot to avoid obstacles, yet they somehow still get stuck under chairs, eat power cords, and drive over discarded socks.
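For the curious, the arithmetic behind that 1D measurement is simple enough to fit in a few lines. Here’s a minimal sketch, assuming the usual speed of sound in room-temperature air (roughly 343 m/s); the echo time in the example is made up for illustration and has nothing to do with any particular sensor.

```python
# Back-of-the-envelope time-of-flight calculation a conventional 1D
# ultrasound sensor performs. Speed of sound and the example echo time
# are illustrative assumptions, not TS3 specifics.

SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at about 20 degrees C

def echo_time_to_distance(echo_time_s: float) -> float:
    """Convert a round-trip echo time into a one-way distance in meters."""
    return SPEED_OF_SOUND_M_PER_S * echo_time_s / 2.0

# An echo that returns after 5.8 ms corresponds to an obstacle about 1 m away.
print(echo_time_to_distance(0.0058))  # ~0.99 m
```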

The TS3, on the other hand, sends a single pulse but receives and integrates multiple reflections, forming a 3D image of the world ahead. It can deduce the sides, edges, and angles of objects, their height, and what empty space might lie between them. Rounded objects like table legs (or human legs) have shape and depth in the Toposens world. 

Why not just install three traditional ultrasound sensors? Won’t that accomplish the same thing? Not a chance, says Toposens Managing Director Tobias Bahnemann. Individual sensors can’t integrate reflected data the way their single 3D sensor can. It’s hard enough to correlate the reflections from a single source bouncing off multiple objects at arbitrary angles; there’s no practical way to combine three separate sources and three separate receivers into one coherent picture. The time-of-flight delta is extremely small for objects that are only a few inches away. That’s what their custom processor and magic algorithm are designed to fix.
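To get a feel for just how small that delta is, here’s a rough sketch of the time difference between closely spaced receivers. The 1-cm receiver spacing and the angles are hypothetical numbers chosen for illustration, not the TS3’s actual geometry.

```python
# Rough look at why the time-of-flight delta between nearby receivers is
# so small. Receiver spacing and angles are assumed for illustration only.

import math

SPEED_OF_SOUND_M_PER_S = 343.0
RECEIVER_SPACING_M = 0.01  # hypothetical 1 cm between adjacent receivers

for angle_deg in (5, 15, 30, 60):
    # Far-field approximation: path-length difference = spacing * sin(angle)
    delta_t = RECEIVER_SPACING_M * math.sin(math.radians(angle_deg)) / SPEED_OF_SOUND_M_PER_S
    print(f"{angle_deg:2d} deg off-axis -> delta of {delta_t * 1e6:5.1f} microseconds")
```

Even at a steep 60 degrees off-axis, the arrival-time difference is only about 25 microseconds, which is why a purpose-built processor earns its keep.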

Like other ultrasound and lidar sensors, the output from the TS3 is a serial data stream delivering a “point cloud,” a table of X, Y, Z, and “loudness” coordinates. This last datum measures the strength of the reflected ultrasound wave and correlates to the material density, or hardness, of the object. Shiny metal objects will reflect more strongly than squishy ones, and developers can use this data to distinguish between, say, table legs and human legs. Or between a stuffed animal and the real thing. 
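To make that concrete, here’s a hypothetical sketch of what consuming such a point cloud might look like on the host side. The comma-separated “x,y,z,loudness” line format is invented for this example; the real TS3 serial protocol isn’t documented here and may differ.

```python
# Hypothetical illustration of parsing a point cloud delivered over a
# serial link. The line format and units (millimeters, arbitrary loudness
# units) are assumptions for this sketch, not the actual TS3 protocol.

from dataclasses import dataclass

@dataclass
class EchoPoint:
    x_mm: float
    y_mm: float
    z_mm: float
    loudness: float  # reflection strength; roughly tracks surface hardness

def parse_point(line: str) -> EchoPoint:
    x, y, z, loudness = (float(field) for field in line.strip().split(","))
    return EchoPoint(x, y, z, loudness)

# Example frame: a hard table leg reflects more strongly than a soft sock.
frame = ["120,-40,300,210", "95,60,310,35"]
points = [parse_point(line) for line in frame]
hard_objects = [p for p in points if p.loudness > 100]
print(hard_objects)
```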

TS3’s point cloud can’t identify what an object is, only where it is. That capability might come later, says Bahnemann. Current users in the automotive and robotics arena don’t need object recognition, but future users might want it to fine-tune their path planning or collision avoidance. Toposens might also start providing velocity information, rather than make customers derive that from successive point clouds. 
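Until then, deriving velocity yourself is straightforward in principle: match points between successive frames and divide the displacement by the frame interval. The naive nearest-neighbor matching and the frame interval below are my own assumptions for illustration, not Toposens’s method.

```python
# Naive sketch of estimating velocity from two successive point clouds by
# nearest-neighbor matching. Frame interval is an assumed example value.

import math

FRAME_INTERVAL_S = 0.1  # assumed time between successive point clouds

def nearest(point, cloud):
    return min(cloud, key=lambda q: math.dist(point, q))

def estimate_velocities(prev_cloud, curr_cloud):
    """Return (x, y, z) velocity estimates in mm/s for each current point."""
    velocities = []
    for p in curr_cloud:
        q = nearest(p, prev_cloud)
        velocities.append(tuple((pc - qc) / FRAME_INTERVAL_S for pc, qc in zip(p, q)))
    return velocities

prev = [(100.0, 0.0, 500.0)]
curr = [(110.0, 0.0, 480.0)]
print(estimate_velocities(prev, curr))  # z shrinking at ~200 mm/s: object approaching
```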

Ultrasound sensors have both advantages and disadvantages compared to other technologies, such as lidar (lasers) and cameras. Ultrasound can see transparent objects, for one. This turns out to be a big deal in hotels, where cleaning robots with light-based sensors bump into glass doors and windows. Not a good look for a high-class establishment. Ultrasound also works in the rain, snow, and total darkness, which can stymie cameras. 

On the other hand, cameras generally produce a more detailed image (i.e., they have more pixels) than ultrasound, and they can detect colors, although that’s rarely important. Camera images are just as “flat” as 1D ultrasound images, requiring two or more cameras to produce a stereoscopic image. Lasers and cameras work at longer ranges than ultrasound; TS3 is effective to about 5 meters. 

Toposens currently sells its TS3 to developers for about $250 a pop but promises volume customers sub-$100 pricing. Tire-kickers include local favorites BMW, Porsche, and Daimler-Benz, as well as Huawei and other electronics firms. The need for sensors is undeniable, but the choices are many. If Toposens, like a robot, has responded appropriately to market demand, the market may respond favorably in kind. 

