
Yet More Sensors

Some Unusual Bits

Yes, more sensors. We covered new sensor developments earlier this year, based on papers from the ISSCC conference. But you know what? People didn’t stop developing sensors after that. I know –  amazing, right? And, yes, as we’ve noted before, many announcements now aren’t about revolutionary new approaches, but rather about faster speed, lower power, and/or a smaller footprint.

Not to say that such announcements aren’t important – that extra board space you save just might mean that you can do that new application that wasn’t possible before. But, from a story standpoint, there’s not so much to say that a press release hasn’t already said.

But there are still other announcements happening that bring with them some slightly new angle or perspective on a topic – something quirky to talk about other than (or in addition to) faster, leaner, smaller. And I’ve been collecting them up over the last few months.

So this is a round-up of five new sensors. None is more important than the others, so I’ll present them in order of company name (or original company name – that will make more sense in a minute). Of course, that doesn’t help with our first two, which happen to be from different corners of AMS. So we’ll take those in order of announcement.

Make Mine a Double

Our first sensor is a magnetic position/angle sensor: the AS5270A/B from AMS. Unlike your run-of-the-mill version, this one is aimed at the top safety level in automotive (and other) applications. You can’t get there by just doing what everyone normally does.

Instead, they doubled the solution. Each single-sensor system is a single die with all the sensor and processing hardware included. They then stack two of them (galvanically isolated); each has its own pinout. Yeah, that last bit is a little surprising. The idea is that, if there’s an electrical fault on signals that are shared between the two sensor dice, then the electrical fault gets presented to both, and there’s no possibility of fail-over. Hence the option of separate pins.

So how, then, is fail-over accomplished? It’s not like the system can quickly re-solder the signals if something goes wrong. This is the role of the system controller. Both sensor dice are connected and running; if a fault is detected on one, then the system controller can do whatever it needs to do to bring the overall system into a safe state.
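To make that concrete, here’s a minimal sketch of the kind of cross-check a system controller might run on the two redundant angle outputs. This is an illustration, not AMS’s actual safety logic; the tolerance, the averaging, and the “return None for safe state” convention are all assumptions.

```python
import math

ANGLE_TOLERANCE_DEG = 2.0   # assumed plausibility window between the two dice


def angular_difference(a, b):
    """Smallest difference between two angles in degrees (handles 360° wrap)."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)


def circular_mean(a, b):
    """Average two angles in degrees without getting tripped up by the wrap."""
    ar, br = math.radians(a), math.radians(b)
    return math.degrees(math.atan2((math.sin(ar) + math.sin(br)) / 2.0,
                                   (math.cos(ar) + math.cos(br)) / 2.0)) % 360.0


def select_angle(angle_a, angle_b, fault_a=False, fault_b=False):
    """Return a usable angle, or None to tell the system to enter its safe state."""
    if fault_a and fault_b:
        return None                     # no trustworthy source left
    if fault_a:
        return angle_b                  # fail over to the healthy die
    if fault_b:
        return angle_a
    if angular_difference(angle_a, angle_b) > ANGLE_TOLERANCE_DEG:
        return None                     # dice disagree: treat as a fault
    return circular_mean(angle_a, angle_b)


print(select_angle(12.0, 12.4))                 # ~12.2: both dice healthy
print(select_angle(12.0, 12.4, fault_a=True))   # 12.4: fail over to die B
print(select_angle(12.0, 25.0))                 # None: implausible disagreement
```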

They gave me some examples of ways to hook these up:

  • Six wires: here the two dice have separate inputs, outputs, and power supplies, for a total of six wires. They give, as an application example, an accelerator pedal: it has to keep working, given the possible consequences of an accelerator failure.
  • Four wires: here inputs and power supplies are shared between the two dice; outputs are separate. They see a throttle sensor as a possible application for this configuration.

The block diagrams are below for both the analog-output and digital-output versions.

(Image courtesy AMS)

Show Me Your True Colors

Yes, color sensors have been around for a long time. What’s different here involves two aspects of how we represent color. The first aspect is the bit depth used to represent colors. Given that our color models are integer in nature, while representing a continuous phenomenon, there are going to be limits and quantization errors and other fussy artifacts that bound the particular range of expressible colors, known as a gamut. Our focus today is “True Color”: 8 bits for each of three color channels (24 bits in all), often accompanied by an 8-bit “alpha” channel for encoding transparency.

The second aspect is subtler: it gets to how colors are encoded within that gamut. We’re used to RGB, but that model has limitations. And this is where color theory stuff gets fuzzy, at least to me.

Because, on one hand, color is simply a wavelength (or frequency) of light. You’ve got the rainbow spanning red through violet, with infrared just below and ultraviolet just above, and lots of colors in between. So you’d think that detecting color should be about measuring a wavelength.

But there’s a hole in that view. Ever notice, for instance, that there’s no brown in the rainbow? And yet brown is a color as far as we humans are concerned. That means that color mixing must now be considered – the summing of more than one wavelength.

For us non-light-specialists, things get troublesome from another perspective. We all (hopefully) get to experience working with color at some level when we’re kids. We play with crayons, paint, and other sources of color. And we all know that, if you mix certain colors, you get other colors. Which is odd, since… yellow and blue make green, but green has its own separate place in the rainbow. How do you mix two wavelengths and synthesize a third one?

Never mind; it clearly works. And the three primary colors are red, yellow, and blue. Mix any two and get variants on most other colors. But you’ll need to mix all three to get any variant of brown.

It doesn’t stop there. Once we enter the world of technology, we see that monitors (and, indeed, even LED light sources) also have primary colors for mixing to get new colors. Problem is, these primary colors are red, blue, and green. Totally different mixing recipes from dyes. What’s up with that?

Much of this is way above my pay grade, but it seems to me to involve a couple of different notions. One is whether you’re working with transmitted light (additive mixing, as in monitors) or reflected light (subtractive mixing, as in paints and dyes); I’m not going to delve into that one here. The other is that, ultimately, as light reaches our eyes, it’s our biological receptors – and how their signals are interpreted by our brains – that determine how we experience color.

Yes, much of color is wavelength. You’d think we’d leave it at that. But evolution has apparently found value in a more complex system where not all colors are equal. For instance, we perceive green as brighter than blue or red of the same intensity. There’s no engineering logic to that; it’s a function of how the cones in our retinas respond. So we can lament a certain lack of orthogonality between colors, or we can try to make our gadgets see the world as closely as possible to how we humans see it.

And… wow, that was a really long way of getting to this second aspect of color representation: the XYZ model. Which is not a new thing: its official name is the CIE 1931 Color Space. Cuz it happened in 1931. So what have we been doing since then?

RGB is certainly a nice, clean model from an engineering standpoint, and it does a reasonably good job for a lot of things. XYZ is a “messier” (if more realistic) model – and we can probably thank biology for that messiness, since rods and cones evolved rather than being designed outright. That said, apparently, early limited attempts at XYZ sensors have created an impression of “hard to do,” “expensive,” and, “Oh come on! RGB is JUST FINE. Sheesh!”
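For the engineering-minded, the bridge between the two models is not exotic: undo the gamma encoding, then apply a 3x3 matrix. Here’s a small sketch converting an 8-bit sRGB value to CIE 1931 XYZ using the standard sRGB/D65 coefficients; notice that the Y (luminance) row weights green far more heavily than blue, which is exactly the “not all colors are equal” point above.

```python
def srgb_to_xyz(r8, g8, b8):
    """Convert an 8-bit sRGB color to CIE 1931 XYZ (D65 white point)."""
    def linearize(c8):
        c = c8 / 255.0                  # scale to 0..1
        # undo the sRGB gamma encoding
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = (linearize(c) for c in (r8, g8, b8))
    # standard sRGB -> XYZ matrix; the Y row is the luminance weighting
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z


# pure green carries far more luminance (Y) than pure blue
print(srgb_to_xyz(0, 255, 0))   # Y ≈ 0.715
print(srgb_to_xyz(0, 0, 255))   # Y ≈ 0.072
```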

Which is where we finally get to our story. AMS has announced the TCS3430: an XYZ True Color sensor with a footprint small enough for phones and the like. They see its price approaching that of more mature color-sensing technologies, at least compared with earlier XYZ attempts. They credit interference-filter technology for this benefit.

Remember back in your early semiconductor training days? There was a time when you could measure the thickness of an oxide layer by looking at its color. (Yes, the colors repeat, so you had to start with a known range. But that’s not the point.)

It’s that same thing: building thin, precise layers of materials – possibly SiO2, possibly others – and using them to detect the basic colors. This same technique also powers hyperspectral sensors. Implemented in a CMOS-like process, they say this brings the cost down and makes it far more stable over its lifetime than organic-based approaches (which may need recalibration). They also use a “dark channel” that serves as a reference point for noise and offsets and such.
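The dark channel’s job is simple enough to sketch: it sees no light, so whatever it reports approximates the offset and leakage common to all channels, and that can be subtracted out. The function below is only an illustration of the idea, not the TCS3430’s actual compensation scheme.

```python
def compensate_dark(raw_x, raw_y, raw_z, raw_dark):
    """Subtract the dark-channel reading (an estimate of offset and leakage)
    from each optical channel, clamping at zero so noise can't go negative."""
    return tuple(max(raw - raw_dark, 0) for raw in (raw_x, raw_y, raw_z))


print(compensate_dark(1200, 980, 450, 35))   # -> (1165, 945, 415)
```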

(Image courtesy AMS)

Step Lively, Now!

Meanwhile, Bosch is continuing its pattern of creating highly integrated sensor applications rather than simply sensors. We’ve seen it before; now we have one going in a new direction: step counters for wearables, the BMA456 and BMA423 (the former being the higher-performance, higher-accuracy part).

The idea is based on the fact that using sensors in a system usually means using a microcontroller along with them. Not so with these devices; Bosch integrates the microcontroller into the chip, and you no longer have to figure out step count; it tells you. You can still get access to the sensor data, but you don’t need to if it’s already feeding you the end result that you want.
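In host-software terms, “it tells you” means reading a ready-made counter out of the part instead of crunching raw acceleration samples yourself. The sketch below assumes, purely for illustration, a 4-byte little-endian step-count value returned from a register read; the real BMA456/BMA423 register map and driver API come from Bosch’s documentation and will differ.

```python
def decode_step_count(buf):
    """Decode a step counter from a raw register read.
    Assumes (hypothetically) a 4-byte little-endian unsigned value."""
    if len(buf) != 4:
        raise ValueError("expected 4 bytes of step-counter data")
    return int.from_bytes(buf, byteorder="little", signed=False)


# e.g., a raw read of 0x10 0x27 0x00 0x00 -> 10,000 steps
print(decode_step_count(bytes([0x10, 0x27, 0x00, 0x00])))
```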

While the primary “engine” is based on their accelerometer technology, they allow an external magnetic sensor connection as well. The mag data can be fused with the raw accelerometer data. It’s not clear how this would help step counting, but, then again, it’s not likely that a wearable device will be a one-trick pony doing only step counting. So one obvious use would be to leverage the magnetometer in a compass application, using the accelerometer for tilt compensation.
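As a sketch of that compass use case, here’s one common formulation of tilt compensation: use the accelerometer to estimate roll and pitch, rotate the magnetic vector back into the horizontal plane, and take the heading from the result. Axis conventions and signs vary from device to device, so treat this as illustrative rather than as Bosch’s fusion code.

```python
import math


def tilt_compensated_heading(ax, ay, az, mx, my, mz):
    """Compute a compass heading (degrees, 0..360) from accelerometer and
    magnetometer readings, using an aerospace-style roll/pitch convention."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, ay * math.sin(roll) + az * math.cos(roll))

    # rotate the magnetic vector back into the horizontal plane
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)

    return math.degrees(math.atan2(-yh, xh)) % 360.0


# flat and level, magnetic field pointing "north" along +x: heading ≈ 0
print(tilt_compensated_heading(0.0, 0.0, 9.81, 30.0, 0.0, -40.0))
```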

It also has power-saving features that let it sleep while still maintaining a count. Power, of course, being one of the critical parameters in any wearable device.

(Image courtesy Bosch)

Soil Ain’t Green

Rohm’s Lapis branch has announced a soil sensor, providing pH, soil temperature, and electrical conductivity data (the last of which reveals moisture). Such measurements can be done today, but, according to Lapis, they involve a handheld instrument whose readings have to be sent out for analysis. They say those results can take a couple of weeks to come back. Not great if you’re trying to decide whether you need to water. (Then again, if you’re actually out there taking measurements, you probably don’t need the gadget for watering decisions, but for decisions on other inputs.)

In this case, the sensors can be implanted in an array, sending data wirelessly up to a kilometer away. Yes, useful on a farm. But they also point to other applications, like detecting a landslide and warning off people in the possible path.

This sensor is different from the others in that it’s a complete system that could be directly used by a consumer. It will enter the market next year, but Rohm has plans to sell it only in Japan for the time being. Actual product details (name, specs) are not yet available.

(Image courtesy Rohm)

Under Pressure

We end with news from MEMSIC. No, wait. The press release came from MEMSIC. But then they apparently divested a bunch of stuff into a new company called Aceinna (pronounced “Ah SEE nah”). So you’ll no longer find the MDP200 differential pressure sensor they announced on MEMSIC’s website; you’ll be rerouted to Aceinna.

We looked at MEMSIC’s unusual thermal accelerometer technology a few years back. Rather than the usual bulk proof masses, it uses a heated airmass that moves about – and that movement can be detected with great sensitivity.

Well, they’re now leveraging that same concept for a differential pressure sensor – that is, one that measures not absolute pressure, but the difference in pressure between two spaces. Which can translate into measuring flow – like airflow in a CPAP machine (for battling sleep apnea).

If the pressures are completely balanced, then the airmass straddles the heater. But with slightly higher pressure on one side, the airmass moves, and that movement can be sensed by thermopiles on each side.
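In first-order terms, the readout reduces to “how unbalanced are the two thermopiles?” The sketch below makes that literal, with a single made-up scale factor standing in for what, in a real part, would be factory calibration and linearization.

```python
def differential_pressure(v_thermopile_a, v_thermopile_b, k_pa_per_volt=250.0):
    """Map the imbalance between the two thermopile voltages to a signed
    differential pressure. A first-order illustration only: the scale factor
    is invented, and a real device applies calibration and linearization
    rather than a single constant."""
    return k_pa_per_volt * (v_thermopile_a - v_thermopile_b)


# positive imbalance -> positive differential pressure (sign convention assumed)
print(differential_pressure(0.0021, 0.0017))
```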

(Image components courtesy Aceinna)

And that’s it for this edition. More to come in the future.

 

More info:

AMS AS5270A/B Magnetic Position sensors

AMS TCS3430 XYZ Tri-Stimulus True Color sensor

Bosch BMA456 and BMA423 wearable sensor

MEMSIC/Aceinna MDP200 pressure sensor
