
Did the Metaverse’s Holy Grail Just Arrive?

As is usually the case, my head is currently jam-packed full of thoughts about all sorts of stuff, including consciousness, pain, octopuses, robots, holograms, Monty Python, the metaverse, and whether or not the word “metaverse” should be capitalized (you should consider yourself lucky that you’re not trapped in here with me).

Let’s start with pain. I, for one, don’t like it. This reminds me of the old chestnut that goes: “Beat me,” begged the masochist. “No” replied the sadist (which also reminds me of another oldie but goodie that poses the question, “What’s the difference between ‘erotic’ and ‘kinky’?” The answer being ‘erotic’ is when you use a feather, while ‘kinky’ is when you use a duck).

In turn, this leads me to my recent discovery that every episode of Babylon 5 is available for streaming on HBO Max. Bearing in mind that there were five seasons of Babylon 5, each with 22 episodes, that’s… a lot of episodes. Since the show was made in the mid-1990s, the quality of the video and the sophistication of the effects leave something to be desired, but the underlying backstory with the Vorlons and the Shadows stirs a certain something in me. Yesterday evening, I watched Season 2, Episode 21, “Comes the Inquisitor,” in which Sebastian—who, we eventually discover, is Jack the Ripper, abducted by the Vorlons in 1888, and who has since served as their interrogator—arrives to do what he does best, which (you may not be surprised to hear) involves dispensing more than a little pain.

I always think it’s fortunate that we can’t remember exactly what pain feels like. I know I don’t want to jab myself with a pin, but I can’t actually remember what the sensation was like the last time I did so.

There are, of course, many different aspects to the concept of pain. Yesterday morning, for example, I arrived at my office early because I had a long day of work planned, only to find all three of my computer monitors presenting black screens with the lead device displaying the ominous message, “Boot Media Not Found.” No matter what I did to my tower (or how hard I struck it with my emergency mallet), it powered up displaying the same grim words of doom and despondency.

Eventually, I called my chum Daniel Imsand who works at GigaParts. In turn, Daniel left voicemails for Nathan and Derek in the service department. I was standing outside the store with my tower under my arm as the doors opened, at which point I was whisked into the service area to take my rightful position as first in the queue (“It’s not what you know, it’s who you know,” as the old saying goes).

I have two solid-state drives (SSDs) in my tower – one is used to hold the operating system (OS) and all my programs, while the other is used to store my vast amounts of data. On the bright side, all of my important data files are backed up in the cloud, on my machine at home, and also on an external drive (“just in case”). On the downside, my OS disk had given up the ghost. Much like the Norwegian Blue in Monty Python’s Dead Parrot sketch, I found myself the proud owner of a drive that was “no more.”

The sterling chaps at GigaParts soon inserted a new 500GB drive, reloaded Windows 10 (my tower is a few years old and is not able to run Windows 11), and reinstalled Microsoft Office 365. Now I’m suffering the pain associated with downloading and installing all of my application software. Much like prodding yourself with a pin, you can’t really recall and appreciate the pain of recovering a computer until you are actually going through it, which is the situation in which I currently find myself (“Ah, now I remember!”).

“But where do the topics of consciousness and octopuses come into the story?” I hear you cry. I’m glad you asked. It shows you’re paying attention. I’m currently in the process of reading Other Minds: The Octopus, the Sea, and the Deep Origins of Consciousness by Peter Godfrey-Smith, and it’s a real page-turner, let me tell you. As the author says: “The octopus is the closest we will come to meeting an intelligent alien. What can we learn from the encounter?” I know that I’m learning a lot, and I don’t even have any octopuses to call my own.

Speaking of which, have you ever wondered whether you should use octopuses, octopodes, or octopi when discussing more than one octopus at a time (or isn’t this a problem you’ve run into)? Well, as you may recall from my Good-For-Nothing Grammarians column, one of my back-burner hobby projects is writing a future candidate for the New York Times Best Seller list called Wroting Inglish: The Essential Guide to Writing English for Anyone Who Doesn’t Want to be Thought a Dingbat. As one of the entries in the chapter on plurals, I note the following:

“[…] consider the fact that the plural of octopus should be ‘octopuses’ or ‘octopodes,’ but you really shouldn’t use ‘octopi’ (even though a lot of folks do). Why? Well, the ‘-us’ changes to ‘-i’ in plurals only for second declension Latin nouns, but octopus comes from the Greek (octo, meaning “eight”, and pod, meaning “foot”; hence “eight-footed”), which means we can’t use the ‘-i’ form to generate the plural.”

“Ah ha!” you are going to say to me. “If this is the case, then why is it OK to use hippopotami as the plural for hippopotamus?” Well, all I can say is that if you doubt my ability to wriggle out of this one… then you have very much misjudged your man! In fact, this is because the English word hippopotamus came to us from the folks who spoke Latin, but they got it from the folks who spoke Greek (where “hippopotamus” translates as “river horse” or “horse of the river”). Thus, by somewhat dubious logic (or, more realistically, dictated by common usage), it is acceptable to use both hippopotami and hippopotamuses as plurals, depending on whether one considers the word hippopotamus to be derived from the Latin or the Greek, respectively. (Give me strength!)

And, speaking of grammar, we hear a lot of talk about the metaverse at the moment, but many of us wonder whether we should capitalize this as Metaverse in our writings.

In futurism and science fiction, the metaverse is a hypothetical iteration of the internet as a single, universal, and immersive virtual world that is facilitated by the use of virtual reality (VR) and augmented reality (AR) headsets. In colloquial use, by contrast, a metaverse is a network of 3D virtual worlds focused on social connection.

The term “metaverse” originated in the 1992 science fiction novel Snow Crash by Neal Stephenson as a portmanteau of “meta” and “universe.” Since Stephenson is the originator of the word, and since he used it 119 times in the book, capitalizing it each and every time, some purists believe that the capitalized Metaverse form should always be used irrespective of the word’s position at the start, in the middle, or at the end of a sentence. By comparison, cooler and calmer folks take the view that the word metaverse is not a proper noun and, as such, should be capitalized only at the beginning of a sentence or in a title.

I think we are getting there, because the only topics we have remaining to bring into our discussion are robots and holograms (although I can feel Monty Python struggling to make another appearance). Two of the books in my personal pantheon of “science fiction classics” are The Caves of Steel and The Naked Sun by Isaac Asimov. 

These tales are set about three thousand years in our future. By that time, hyperspace travel has been discovered and fifty planets known as “The Spacer Worlds” have been colonized.

In The Caves of Steel, we discover that the Earth is heavily overpopulated. As a result, the people on Earth now live in vast city complexes covered by huge metal domes—the “Caves of Steel”—each of which is capable of supporting tens of millions of people. Two of the main characters in this torrid tale are New York City police detective Elijah Baley and his partner R. Daneel Olivaw, who is forced on Elijah by the Spacers (the ‘R’ stands for robot).

Later, in The Naked Sun, we discover that the Spacer Worlds are rich and have low population densities. Elijah and R. Daneel are sent to solve a murder on Solaria, which is one of the richest worlds with the smallest population. Solaria’s population is so low, in fact, that each person lives in splendid isolation on his or her own estate served by tens of thousands of robots (the estates are hundreds or thousands of square miles in size). However, no one is lonely because they “visit” each other using a form of 3D holographic communication that allows them to virtually dine together, go for virtual walks together, and so forth.

Contrast this with the bulky AR and VR headsets we are obliged to wear today. The Holy Grail of user interfaces and displays is to have some form of true 3D holographic technology that would allow users to experience the metaverse without any form of headset whatsoever. (I bet that—after hearing the term “Holy Grail”—you, like me, are now thinking of the scene in Monty Python and the Holy Grail when King Arthur instructs the French soldier to tell his master that he can join them on their quest to find the eponymous Holy Grail, and the soldier responds, “I’ll ask him, but I don’t think he’ll be very keen, he’s already got one, you see.”)

The reason I’m waffling on about all of this is that I was just chatting with some folks from Imec and a brand-new startup company called Swave Photonics. Specifically, I was talking to Olivier Rousseaux, Director of Venturing Development at Imec; Xavier Rottenberg, Fellow of Sensing and Actuation Technologies at Imec (Xavier is the scientist who first came up with the idea for the technology we are about to discuss); Theodore Marescaux, CEO and founder of Swave; and Dmitri Choutov, COO of Swave.

Just to make sure we’re all tap-dancing to the same drumbeat, Imec describes itself as:

A world-leading research and innovation center in nanoelectronics and digital technologies. Imec leverages its state-of-the-art R&D infrastructure and its team of more than 5,000 employees and top researchers for R&D in advanced semiconductor and system scaling, silicon photonics, artificial intelligence, beyond 5G communications and sensing technologies, and in application domains such as health and life sciences, mobility, industry 4.0, agrofood, smart cities, sustainable energy and education.

The way I think of this is that Imec comes up with all sorts of amazing inventions and fundamental technologies, which they subsequently farm out to other companies—both established entities and fledgling startups—to be developed and brought to market.

This is where Swave comes in. This startup is literally just a few weeks old as I pen these words. It was formed to take the Holographic eXtended Reality (HXR) technology developed by Imec and use it to bring the metaverse to life. As the folks from Swave say in their own words:

Our HXR technology is the Holy Grail of the metaverse, delivering lifelike, high-resolution 3D images that are viewable with the naked eye, with no compromises. HXR technology enables 1000x better pixel resolution with billions of tiny, densely packed pixels to enable true realistic 20/20 vision without requiring viewers to wear smart AR/VR headsets or prescription glasses.

Swave’s HXR technology projects lifelike holographic images that eliminate today’s AR/VR/XR challenges of focal depth and eye tracking, so viewers can easily focus on nearby and faraway objects. Most importantly, the HXR chips are manufactured using standard CMOS technology, which enables cost-effective scaling. Leveraging advances in photonics and holography based on diffractive optics, Swave’s HXR gigapixel technology targets metaverse platforms, 360-degree holographic walls, 3D gaming, AR/VR/XR glasses, collaborative video conferencing, and heads-up displays for automotive and aerospace systems.

Swave technology can also power holographic headsets that deliver immersive 3D AR/VR/XR experiences with stunning high resolution, perfect depth of focus and 180-degree to 360-degree viewing angles, all without the headaches experienced by users of conventional headsets. Applications powered by HXR gigapixel technology will be capable of passing the visual Turing test in which virtual reality is practically indistinguishable from real-world images that humans see with their own eyes.

O-M-G! I cannot wait to see (no pun intended) this technology brought to fruition. When I read The Naked Sun, my young and impressionable mind was enamored with the concept of holographic displays, but I never expected to see something at the level of 3D holographic walls in my lifetime.

I also have to say that I envy the folks at Swave. The company boasted only six employees when I talked to Theodore and Dmitri, but this number may well have grown by now.

I joined a startup called Cirrus Designs way back in the mists of time we used to call 1981. I was the sixth person in, and I arrived the day after the desks and chairs, so everyone who was already there said that I was the lucky one. I look back on my time working at that startup as being one of the most exciting and fruitful periods in my life.

If I were wearing a younger man’s clothes, I would be pounding on Swave’s door asking for a job, but I’m juggling too many balls and spinning too many plates as it is. But what about you? If you are interested in working in this field, you really should check out Swave’s job postings on LinkedIn. And, even if you don’t want to work for Swave, are you ready to experience a true 3D holographic metaverse experience? I know I am because I just dispatched the butler to fetch my 3D holographic trousers to ensure I’m suitably attired.
