
Are Holographic Displays the Ultimate UI for AI?

1947 was an interesting year. Not that I was there myself, you understand, but I’ve heard stories. For example, 1947 was the year the Hungarian-British electrical engineer and physicist Dennis Gabor invented holography. 1947 was also the year William Shockley, John Bardeen, and Walter Brattain demonstrated the first transistor at Bell Labs. Also in 1947, Alan Turing gave what the Encyclopedia Britannica describes as “Quite possibly the earliest public lecture to mention computer intelligence.” Now, a mere 77 years later, these three fields are coming together in awesome ways.

I’ve said it before, and (doubtless) I’ll say it again: I think the combination of artificial intelligence (AI) and mixed reality (MR) is going to change the way we interface with our systems, the world, and each other. Just to ensure we’re all tapdancing to the same skirl of the bagpipes, MR encompasses augmented reality (AR), diminished reality (DR), virtual reality (VR), and augmented virtuality (AV). Some people also use the term extended reality (XR), but I equate this to MR, and don’t even get me started on hyper reality (HR) (see also What the FAQ are VR, MR, AR, DR, AV, and HR?).

One of the key requirements for the AI+MR/XR combo is some form of display. This is where the guys and gals from Swave Photonics leap onto the center stage with a ballyhoo of bimbonifonos (my ears are still ringing). Just a little over two years ago as I pen these words, Swave started life with a mission to take the Holographic eXtended Reality (HXR) technology developed by imec and use it to bring the metaverse to life. As they trumpeted, tromboned, and bimbonifoned at that time:

Our HXR technology is the Holy Grail of the metaverse, delivering lifelike, high-resolution 3D images that are viewable with the naked eye, with no compromises. HXR technology enables 1000x better pixel resolution with billions of tiny, densely packed pixels to enable true realistic 20/20 vision without requiring viewers to wear smart AR/VR headsets or prescription glasses.

Swave’s HXR technology projects lifelike holographic images that eliminate today’s AR/VR/XR challenges of focal depth and eye tracking, so viewers can easily focus on nearby and faraway objects. Most importantly, the HXR chips are manufactured using standard CMOS technology, which enables cost-effective scaling. Leveraging advances in photonics and holography based on diffractive optics, Swave’s HXR gigapixel technology targets metaverse platforms, 360-degree holographic walls, 3D gaming, AR/VR/XR glasses, collaborative video conferencing, and heads-up displays for automotive and aerospace systems.

Swave technology can also power holographic headsets that deliver immersive 3D AR/VR/XR experiences with stunning high resolution, perfect depth of focus and 180-degree to 360-degree viewing angles, all without the headaches experienced by users of conventional headsets. Applications powered by HXR gigapixel technology will be capable of passing the visual Turing test in which virtual reality is practically indistinguishable from real-world images that humans see with their own eyes.

Read that last sentence again. The one that says: “Applications powered by HXR gigapixel technology will be capable of passing the visual Turing test in which virtual reality is practically indistinguishable from real-world images that humans see with their own eyes.”

Hold that thought in your mind while we take a brief digression. I was at the front of the queue when the Oculus Rift VR headset hit the streets in 2016. On the one hand, that’s only 8 years ago. On the other hand, it seems to be a lifetime away. Now I have a Meta Quest 3 MR headset. This is great, and it’s easy to become captivated and to suspend your disbelief. For example, watch this video of people trying Richie’s Plank Experience.

If you really want to push this to the next level, lay an 8” wide, 1” thick, 10-foot-long plank on the floor and balance on this while “enjoying” the experience.

The point is that we are used to trusting our eyes. I once used mine to read that 80% of our sensory perception of the world is visual (I don’t know who determined this or how they came up with this number). As a result, even though the graphics associated with today’s VR headsets really aren’t that great, taking a stroll on Richie’s Plank can certainly set your heart racing. Now, imagine the same experience while wearing a headset capable of passing the visual Turing test…

The reason I’m waffling on about this here is that I was just chatting with Mike Noonen, who is CEO at Swave. My head is still buzzing, and my brain is still wobbling on its gimbals at the things I heard and saw. “What sort of monster-size headset is this going to take?” I asked. Well, Mike reached for a headset to show me, but he mixed it up with his own glasses. After we’d sorted that out, he presented a pair of glasses that looked much like the ones I’m wearing now, but with a few extra bits and pieces attached to the insides of the frames.


(Source: Swave)

As we see, we have RGB lasers “firing backwards” to illuminate Swave’s HXR spatial light modulator (SLM), where an SLM is a device that can control the intensity, phase, or polarization of light in a spatially varying manner. This HXR SLM is the device that’s taping out as we speak.

The HXR SLM projects the sculpted wavefront onto a free-form mirror, which relays it onto a semi-transparent mirror on the lens of the glasses. In turn, this mirror directs the image into the user’s pupil.
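If you’re wondering how a phase-modulating SLM turns a 2D image into a “sculpted wavefront” in the first place, the classic textbook recipe is the Gerchberg-Saxton algorithm. The NumPy sketch below is purely my own illustration of that textbook idea (Swave’s actual algorithms are their own affair): bounce the light field back and forth between the SLM plane and the image plane, keeping the computed phase each time while imposing the amplitudes you already know.

```python
import numpy as np

def phase_hologram(target, iterations=50):
    """Gerchberg-Saxton sketch: find a phase-only SLM pattern whose far-field
    intensity approximates a target image (textbook method, not Swave's)."""
    target_amp = np.sqrt(target / target.max())        # desired amplitude in the image plane
    slm_field = np.exp(1j * 2 * np.pi * np.random.rand(*target.shape))  # random starting phase
    for _ in range(iterations):
        image_field = np.fft.fft2(slm_field)           # propagate SLM plane -> image plane
        image_field = target_amp * np.exp(1j * np.angle(image_field))   # keep phase, impose target amplitude
        slm_field = np.fft.ifft2(image_field)          # propagate back to the SLM plane
        slm_field = np.exp(1j * np.angle(slm_field))   # phase-only constraint (the SLM can't change amplitude)
    return np.angle(slm_field)                         # phase map to load onto the modulator
```

Load the resulting phase map onto the modulator, illuminate it with a coherent laser, and the far field reconstructs the image. Doing that per color, per depth, at video rates, and at quarter-billion-pixel resolutions is, of course, where the hard engineering lives.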

Mike says that, in order to do holography correctly and to obtain the enormous field of view we require, we need the pixel pitch to be at 0.5 lambda (i.e., half the wavelength of the light regime being used). Digital light processing (DLP) technology currently has a pixel pitch of 5.4µm, which equates to 10.8 lambda, while liquid crystal on silicon (LCoS) has a pixel pitch of 3µm, which equates to 6 lambda. Even state-of-the-art micro-LED (µLED) is hovering around 2.5µm. Although these are all awesome technologies, they fall flat when it comes to holograms. Why? Well, according to Mike, “When you have a pixel pitch that’s multiples of the wavelength of light you are working with, you end up with a field of view that’s kind of like looking through a keyhole.”
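For the back-of-envelope fans among us, the “keyhole” effect drops straight out of the grating equation: a pixelated modulator can only steer light out to sin(θ) = λ/(2p), where p is the pixel pitch. The little sketch below uses the roughly 500nm wavelength implied by the arithmetic above (5.4µm being 10.8 lambda); the field of view any real product achieves also depends on its optics, so treat these numbers as illustrative rather than gospel.

```python
import math

WAVELENGTH_NM = 500  # implied by the numbers above (5.4 um = 10.8 lambda); my assumption

def diffraction_limited_fov_deg(pitch_nm, wavelength_nm=WAVELENGTH_NM):
    """Maximum full field of view a pixelated modulator can address,
    from the grating equation sin(theta) = lambda / (2 * pitch)."""
    s = wavelength_nm / (2 * pitch_nm)
    if s >= 1.0:
        return 180.0  # half-wavelength pitch (or finer) can address the full hemisphere
    return 2 * math.degrees(math.asin(s))

for name, pitch_nm in [("DLP", 5400), ("LCoS", 3000), ("micro-LED", 2500), ("0.5-lambda pitch", 250)]:
    ratio = pitch_nm / WAVELENGTH_NM
    print(f"{name:>16}: {ratio:4.1f} lambda pitch -> FOV ~ {diffraction_limited_fov_deg(pitch_nm):5.1f} degrees")
```

Run it and the keyhole becomes obvious: the DLP and LCoS pitches top out at single-digit to low-double-digit fields of view, while a half-wavelength pitch can, in principle, address the entire hemisphere in front of the chip.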

By comparison, Swave’s HXR pixel pitch is <300nm. This equates to the magical 0.5 lambda, which means they kick off with a 160-degree field of view right from the get-go. Another way to look at this is that Swave’s HXR SLM packs 16 megapixels per square millimeter, which gives them a quarter of a billion pixels per eye! As Mike says, “This gives us an embarrassingly large number of pixels—the world’s highest resolution display by far, albeit for holography—all packed into a little CMOS chip with CMOS economics.” Speaking of which, all of this is being taped out on 300mm wafers implemented using a mature process node, which equates to lower cost and lower risk.
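As a quick sanity check on those headline figures (assuming a square pixel grid, which is my assumption rather than anything Swave has specified), 16 megapixels per square millimeter implies a pitch of 250nm, and a quarter of a billion such pixels occupy only around 16 square millimeters of silicon:

```python
import math

pixels_per_mm2 = 16e6    # 16 megapixels per square millimeter (per the article)
pixels_per_eye = 0.25e9  # "a quarter of a billion pixels per eye"

pitch_nm = 1e6 / math.sqrt(pixels_per_mm2)   # square-grid assumption (mine, not Swave's)
area_mm2 = pixels_per_eye / pixels_per_mm2

print(f"implied pixel pitch : {pitch_nm:.0f} nm")    # -> 250 nm, consistent with "<300nm"
print(f"implied active area : {area_mm2:.1f} mm^2")  # -> ~15.6 mm^2, a chip a few millimeters on a side
```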

(Source: Swave)

Now, when someone says “holographic displays” I immediately think of Princess Leia’s message (but preferably in full color, at a higher resolution, and with fewer visual artifacts and less noise).

But this started me worrying. I’m less enthused about having a pair of lightweight MR glasses if I’m also compelled to carry a 40-pound backpack loaded with graphics processing units (GPUs) to generate the images and car batteries to power everything.

Mike set my fears to rest. Although full 3D animations and photo-realistic imagery may come at some time in the future, what we are talking about in the nearer term is the ability to take 2D bitmaps and project them as holograms into 3D space.

(Source: Swave)

As just one example, Mike says that anything that’s on your tablet’s desktop, or your smartphone’s screen, or the face of your smartwatch can be displayed on these glasses using Bluetooth-level bandwidths.
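To get a feel for why “Bluetooth-level bandwidths” is plausible for this kind of 2D content, here’s a rough estimate using my own illustrative numbers (none of them come from Swave): a modestly sized bitmap, refreshed a few times a second with ordinary image compression, fits comfortably within the couple of megabits per second that a Bluetooth Classic link can realistically sustain.

```python
# Back-of-envelope: can a 2D bitmap UI ride over a Bluetooth-class link?
# All figures below are my own illustrative assumptions, not Swave numbers.

BLUETOOTH_BUDGET_MBPS = 2.0  # rough usable throughput of a Bluetooth Classic link

def stream_mbps(width, height, fps, bits_per_pixel):
    """Average bit rate for a compressed image stream."""
    return width * height * fps * bits_per_pixel / 1e6

watch_face   = stream_mbps(450, 450, 5, 0.5)     # smartwatch face, 5 fps, ~0.5 bpp after compression
phone_screen = stream_mbps(1080, 2400, 2, 0.25)  # phone screenshot pushed twice a second

print(f"watch face   : {watch_face:.2f} Mbit/s")     # ~0.51 Mbit/s
print(f"phone screen : {phone_screen:.2f} Mbit/s")   # ~1.30 Mbit/s
print(f"link budget  : {BLUETOOTH_BUDGET_MBPS:.2f} Mbit/s")
```

Raw video for a quarter-billion-pixel holographic scene obviously wouldn’t fit down that pipe, which is exactly why the near-term story is “project your existing 2D content into 3D space” rather than “stream photo-realistic holographic movies.”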

This opens the door to an incredible wealth of applications, including providing the ideal UI for AI. For example, when I’m walking around a strange city, I’ve long wanted to be able to see arrows appearing in front of me, guiding me to where I want to go. Now, using the GPS and maps capabilities in my smartphone coupled with glasses equipped with Swave technology, I can see this coming to fruition.

Will this be in my lifetime? You can bet your little cotton socks that it will! In fact, Swave is currently taking orders from device manufacturers for HXR development kits, which will reach customers in the second half of 2024. These development kits will provide the hardware and software for companies to design, prototype, and test their own XR hardware and form factors using the Swave chipset.

I, for one, am bouncing off the walls with excitement. How about you? Do you have any thoughts you’d care to share about any of this?
