I just had an interesting video conference call. It started off as so many things do… with people laughing at me (I live for the day when people laugh *with* me rather than *at* me). First, there was the fact that my name invariably comes up as “Max the Magnificent” in conference applications like Zoom and Teams. I can no longer recall how this came to be, and I have no idea how to change it, so there we are.
On the other hand, now I come to think about it, why should I change it? “If you’ve got it, flaunt it,” as the old saying goes (or “When you got it, flaunt it,” as Mel Brooks famously said in the 1967 movie version of The Producers).
Once they’ve stopped chortling to themselves, my conference companions commonly convert their consideration to the pictures on the wall behind me. This is a tad embarrassing, in a way, because most of these little scamps are of myself, but that’s mainly because my wife (Gina the Gorgeous) says that they are much too good to be wasted on our walls at home. This would be all the more flattering (and believable) if she didn’t speak these words between gritted teeth but—hey—I’ll take whatever words of encouragement I can get.
Poised to power-up a project (Source: Max Maxfield)
But we digress… My video chat was with Ruby Yan, who is the Business Unit Director of the Human-Machine Interface Product Line at GlobalFoundries (GF). Many people who are not “in the business” remain blissfully unaware of the fact that, as of 2023, GlobalFoundries is the world’s third-largest semiconductor foundry by revenue. Also, it is the only one with operations in Singapore (with both 200mm and 300mm wafer fabrication plants), the European Union (with a 300mm plant in Dresden, Germany), and the United States (with a 200mm plant in Essex Junction, Vermont, and a 300mm plant in Malta, New York). Furthermore, GlobalFoundries is a “Trusted Foundry” for the US federal government, and enjoys similar designations in Singapore and Germany, including certification to the international Common Criteria standard (ISO/IEC 15408, CC Version 3.1).
The purpose of our call was for Ruby to bring me up to date with the latest and greatest happenings with respect to GF’s FDX FD-SOI process and platform.
As an aside, as soon as the FD-SOI (which stands for “fully depleted silicon-on-insulator”) moniker made its presence felt, I remembered the column I wrote about Lattice Semiconductor using an FD-SOI process for their latest and greatest FPGAs—see Handling Radiation in SRAM-Based FPGAs. To be honest, I have no idea if the guys and gals at Lattice use the chaps and chapesses at GF to build their FPGAs, but the underlying process-related concepts are the same.
Some of the aspects of the FD-SOI process that stuck in my mind are (a) by varying the biasing of the substrate, users can decide whether they wish to run for high performance (HP) or low power (LP), and (b) this process is inherently resilient to radiation effects like single event upsets (SEUs), multiple cell upsets (MCUs), and multiple bit upsets (MBUs). But, once again, we digress…
Returning to GF’s FDX FD-SOI process and platform, in addition to high performance and ultra-low power, this platform offers full System-on-Chip (SoC) integration, including digital, analog, and high-performance radio frequency (RF) functionality.
Ruby told me that, having developed and deployed the core FDX FD-SOI process, they started to see use cases picking up for this type of high-performance ultra-low-power offering. These use cases were coming from widely disparate sources, including automotive, security monitoring, and medical applications.
One application area that started to see a lot of traction was ultra-low-power ultra-wideband (UWB) radar. UWB is a radio technology that uses very low energy levels for short-range, high-bandwidth communications over a large portion of the radio spectrum. Typical applications include sensor data collection, precision locating, and tracking.
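For readers who like to see things pinned down, the commonly cited regulatory definition of UWB (from the FCC’s Part 15 rules) is a signal occupying at least 500 MHz of absolute bandwidth, or a fractional bandwidth greater than 20% of its center frequency. A minimal sketch of that check, using purely illustrative frequencies (not tied to any particular product):

```python
# Minimal sketch: does a band qualify as UWB under the common
# regulatory definition (>= 500 MHz absolute bandwidth, OR
# fractional bandwidth > 20% of the center frequency)?

def is_uwb(f_low_hz: float, f_high_hz: float) -> bool:
    """Return True if [f_low_hz, f_high_hz] meets the UWB definition."""
    bandwidth = f_high_hz - f_low_hz
    center = (f_high_hz + f_low_hz) / 2
    fractional_bw = bandwidth / center
    return bandwidth >= 500e6 or fractional_bw > 0.20

# A hypothetical 7.25-8.75 GHz channel (1.5 GHz wide) easily qualifies...
print(is_uwb(7.25e9, 8.75e9))                  # True
# ...while a 20 MHz Wi-Fi channel near 2.4 GHz does not.
print(is_uwb(2.427e9, 2.447e9))                # False
```

That enormous bandwidth is what buys UWB its fine time (and hence range) resolution, which is exactly what the sensing applications below depend on.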
In the case of the automotive arena, for example, it seems increasingly common to hear horror stories in the news of adults inadvertently leaving infants or pets in cars on blazingly hot days. In many cases, the reason we hear about this is that the unfortunate kids or pets don’t survive the experience. Oftentimes, this occurs because the adult was either unaware that another living thing was present, or they simply forgot. It’s easy to cast aspersions (well, not for me because my throwing arm isn’t what it used to be), but it’s also possible to sympathize with someone working multiple jobs who lost track of the fact that they had agreed to drop a child off at school or a pet off at the vet. I often end up cruising along on auto-pilot—and I’m not talking about the one in my car—only to surprise myself by arriving at an unintended destination.
I cannot even imagine the horror experienced by a parent or guardian who inadvertently kills a kid in this way. This is something that would haunt you for the rest of your life.
The thing about ultra-low-power UWB is that a sensor can be used to detect the presence (and number) of people in a car—even if they are asleep or motionless—by detecting their breathing and the beating of their hearts. This technology also has medical applications like monitoring patients in hospitals or tracking one’s sleep patterns at home.
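To give a flavor of how this works, the tiny chest motion caused by breathing modulates the phase of the radar return, so a spectral peak in the “slow-time” signal from a fixed range bin reveals the breathing rate. Real UWB vital-sign pipelines (range gating, clutter removal, phase unwrapping) are far more involved than this, but the core idea can be sketched with a simulated phase signal and an FFT:

```python
# Sketch of the signal-processing idea behind radar breathing detection:
# simulate the phase of the reflection from a breathing subject, then
# locate the dominant spectral peak. Illustrative only; not any vendor's
# actual algorithm.
import numpy as np

fs = 20.0                       # slow-time sample rate, Hz (radar frames/sec)
t = np.arange(0, 32, 1 / fs)    # 32 seconds of observation

breathing_hz = 0.25             # 15 breaths per minute
phase = 0.5 * np.sin(2 * np.pi * breathing_hz * t)                # chest motion
phase += 0.05 * np.random.default_rng(0).standard_normal(t.size)  # noise

# Magnitude spectrum of the zero-mean slow-time signal; skip the DC bin.
spectrum = np.abs(np.fft.rfft(phase - phase.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak_hz = freqs[1:][np.argmax(spectrum[1:])]

print(f"Estimated breathing rate: {peak_hz * 60:.0f} breaths/min")
# → Estimated breathing rate: 15 breaths/min
```

A heartbeat shows up the same way, just as a smaller motion at a higher frequency (around 1 to 2 Hz), which is why these sensors can spot a sleeping infant who isn’t otherwise moving at all.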
Things are heating up (no pun intended), because new regulations are coming into effect that will require child and pet (and grandparent) detection in all new vehicles.
To ensure they maintain their leadership position in this area, the folks at GF have been collaborating with multiple sensor companies and partnering with NOVELDA to fine-tune their FDX FD-SOI process and platform and push the boundaries of what’s possible with UWB radar technology…
… … … …
…I’m sorry. I lost track of time there for a moment. I fear I was sucked into NOVELDA’s website, tempted by their teasing graphics to click the “Learn More” links associated with their CPD & Vital Signs Sensor, Ultra-Low-Power Presence Sensor, Proximity Sensor, and Occupancy Sensor offerings.
I can envisage a time in the not-so-distant future when we wend our way through a world of ubiquitous sensors, supported by sophisticated artificial intelligences (AIs) that can accurately interpret what they “see” and use this knowledge to make our lives easier and safer. What say you? Do you have any thoughts you’d care to share on any of this?