
SiLC’s FMCW LiDAR Can Perceive and Identify Objects More Than a Kilometer Away

I’ve told this tale before, and I’ll doubtless tell it again because I think it’s worth the retelling. When I was a kid (think Charlie Brown), my mom had a full-time job (which was unusual for ladies at that time), so I spent the halcyon days of summer when we were out of school up the road at my Auntie Barbara’s house hanging out with my cousin Gillian (think Lucy because she was mean LOL).

As an aside, one day whilst getting into our usual mischief in the backyard, Gillian and I saw a big, bloated spider enter one end of a coiled hosepipe (which is what they call a “garden hose” in England). We sat there for hours without saying a word or moving a muscle waiting for it to emerge from the other end (it never did). Maybe it was hiding just inside, keeping one of its eight beady eyes on us waiting for us to go away. It’s funny the memories that pop into your head after almost 60 years, but we digress…

Sometime during the morning, my aunt would call us in, lift us up and set us on the kitchen counter, and scrub our hands, elbows, knees, and faces—she was good at scrubbing and we had the reddest knees and elbows in the district—after which we would set off on our daily trek to the shops at the bottom of the road.

In addition to a newsagent, pharmacy, post office, florist, hairdresser, and ironmonger, there was a bread shop, a greengrocer, a butcher, and a fishmonger (I always closed my eyes when we passed the fishmonger’s because I was a sensitive child and I was pretty sure I didn’t want to see anyone monging a fish).

On the way down the road to the shops, and on the way back up the road afterwards, we would invariably run into other adults who were oftentimes accompanied by their own kids. The adults would always feel the need to stop and chat. At that time, I thought the adults were as old as the hills but—looking back now—they were probably in their mid-30s.

As I recall, it seemed they had only three topics of conversation. The first was the weather—how it was today, how this compared to the past 100 years, and what we could expect for the next 50 years. The second was health in general—who was currently sick, who had recently been sick, and who was probably going to get sick if they weren’t careful—with a particular focus and relish on the topic of operations (I counted it as a good day if they didn’t start showing each other their scars). The third was how time seemed to go faster as you got older.

Most of the time when the adults were talking, I heard it only as plaintive hooting taking place way above my head. If I occasionally took the time to tune into what they were saying, I always thought something along the lines of, “Good Grief, can’t you think of anything else to talk about?” The funny thing is that, now I’m an older and sadder man, I find myself doing the same thing, endlessly waffling on about how fast time seems to go the older you get. For example…

…it seems like only a couple of weeks ago that I penned my column Equipping Machines with Extrasensory Perception. In reality, more than 8 months have slipped past my nose. What can I say? Time really does seem to go faster as you get older.

In my earlier column, we discovered how the folks at SiLC Technologies are creating what they claim to be the most compact frequency modulated continuous wave (FMCW) LiDAR ever made. While trying (unsuccessfully) to be modest, they also say that their Eyeonic vision sensor technology is multiple generations ahead of any of their competitors with respect to the level of optical integration it employs.

The Eyeonic vision sensor is now available to strategic partners
(Image source: SiLC)

Just to remind ourselves, as I said in my original column: “The LiDAR systems that are currently used in automotive and robotic applications are predominantly based on a time-of-flight (TOF) approach in which they generate powerful pulses of light and measure the round-trip time of any reflections. These systems use massive amounts of power—say 120 watts—and the only reason they are considered to be ‘eye-safe’ is that the pulses they generate are so short. Also, when they see a pulse of reflected light, they don’t know if it’s their pulse or someone else’s. And, just to increase the fun and frivolity, since light travels approximately 1 foot per nanosecond, every nanosecond you are off may equate to being a foot closer to a potential problem.”
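To put some numbers on that last point, here’s a minimal Python sketch (my own illustration, nothing to do with SiLC’s actual systems) of the basic TOF range calculation and what a one-nanosecond timing error costs you:

```python
# Illustrative sketch only (not SiLC code): basic pulsed TOF ranging
# and the effect of a 1 ns timing error.

C = 299_792_458.0  # speed of light, m/s (~0.3 m, or ~1 foot, per ns)

def tof_range_m(round_trip_s: float) -> float:
    """One-way range from a round-trip time: the pulse travels out
    and back, so divide by two."""
    return C * round_trip_s / 2.0

# A target 100 m away returns its echo after ~667 ns.
t = 2 * 100.0 / C
print(tof_range_m(t))  # ~100.0 m

# A 1 ns error in the round-trip measurement shifts the range
# estimate by c/2 per nanosecond, i.e., about 0.15 m.
print(tof_range_m(t + 1e-9) - tof_range_m(t))  # ~0.15 m
```

Note that because the measured time is a round trip, a 1 ns timing error works out to about half a foot of range error, since the light covers roughly a foot per nanosecond but only half of that journey is in your direction of interest.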

By comparison, the FMCW-based Eyeonic sends out a continuous laser beam at a much lower intensity than its pulsed TOF cousins, and it uses a local oscillator to mix any reflected light with the light generated by its coherent laser transmitter. By means of some extremely clever digital signal processing (DSP), it’s possible to extract instantaneous depth, velocity, and polarization-dependent intensity on a per-pixel basis while remaining fully immune to any environmental and multi-user interference. Furthermore, the combination of polarization and wavelength information facilitates surface analysis and material identification.

I was just chatting with Ralf Muenster, who is VP, Business Development and Marketing at SiLC. Ralf reminded me that prior to this year’s CES, which took place 5-7 Jan 2022, the guys and gals at SiLC had demonstrated a detection range of more than 500 meters, which is not shabby at all.

Well, the chaps and chapesses at SiLC have been refining and optimizing their technology and—as a result and as demonstrated in this video—they have now shown the ability to perceive, identify, and avoid objects at a range of more than 1 kilometer. The folks at SiLC say this is a feat that no other company can claim.

 

As one final aside, when Ralf first told me about this, he actually said that they now had the ability to perceive, identify, and avoid objects at “a range of more than 1,000 meters.” For some reason, this didn’t really grab my attention as much as it should have. It was only when I translated “more than 1,000 meters” to “more than 1 kilometer” in my noggin that I thought, “Hang on, a kilometer is a looooooong way.” I obviously know that 1,000 meters and 1 kilometer are the same thing (I’m an engineer, I went to university, this is one of the things they taught us), so why is it that “1 kilometer” seems so much meatier than “1,000 meters” (am I alone here, or do you think of things the same way)?

Ultra-long-range visibility is a requirement in many industries that utilize machine vision, including automotive, metrology, construction, drones, and more. Specific scenarios include providing enough time for a vehicle to evade an obstacle at highway speeds, enabling a drone to avoid others in the sky, and controlling deforestation by making precision mapping and surveying of forests possible.

Next-generation vision sensors that incorporate millimeter-level accuracy, depth, and instantaneous velocity are key to true autonomous driving and other machine vision applications, and SiLC’s FMCW LiDAR looks like being the optimal technology to make this vision (no pun intended) a reality. What say you? What do you think about all of this?
