
Do Robot Dogs Dream of Cyborg Cats?

Were you wondering if anyone was in the process of using decentralized artificial intelligence (AI) to create a pack of robot dogs? If so, I can put your mind to rest by informing you that the answer to your question is a resounding “Yes!”

Did you notice that I used the term “decentralized AI” as opposed to “distributed AI” in the preceding paragraph? If so, were you also wondering about the difference between decentralized and distributed AI? Should such be the case, then you’ll be happy to know that you aren’t alone, because I’m wondering about that too.

The reason I dropped this topic into the conversation — like plopping a pebble into a pond — is to watch the metaphorical ripples gently spread out (I find metaphorical ripples are much more relaxing than the other kind). The thing is that I was just chatting with John Suit, who is the advising CTO (chief technology officer) at a robotics company called KODA. John was kind enough to enlighten me by explaining that “decentralized AI and distributed AI are slightly different concepts, where said differences depend on the implementation; also, these may be largely regarded as philosophical differences, at least in potential.” Well, all I can say is that I’m glad we managed to clear that one up — I’ll certainly sleep easier in my bed tonight.

John went on to say that he sees decentralized AI as the key change that will turn the current static IoT system into an intelligent one, capable of making decisions and changes in real time without the delays associated with human influence or oversight.

In a nutshell, the folks at KODA are building a new generation of consumer electronics that layer distributed AI and blockchain technology to create smart, secure networks of devices that collaborate and learn with minimal human input. The first of these products is a robot dog called KODA (what are the odds?).

Every now and then I see some new technological wonder that sets a cacophony of thoughts and ideas ricocheting around my poor old noggin. Well, it’s fair to say that such an occurrence happened when I was introduced to KODA (both the dog and the company).

I didn’t even know I wanted a robot dog until I saw KODA. Now I can easily envisage having my own KODA sitting here in my office while I work, only waiting for me to glance in its direction and raise a quizzical eyebrow to indicate that it’s time for him to take me out for a walk.

Let’s start with the fact that KODA is described as being “a social robot dog.” What do we mean by “social” in this context? Well, unlike other robot dogs of this general type that are on the market, the folks at KODA say that KODA is designed to be functional from both pragmatic and emotional perspectives. When you call KODA’s name, it will recognize your voice, turn in your direction, recognize your face, and determine your mood of the moment. This is also part of the reason why KODA has a head: when a KODA cocks its ear to its owner’s voice and runs over to be close, its owner will know the dog heard and understood what was expected of it.

KODA is a social robot dog (Image source: KODA)
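
Just to give a flavor of what this “hears its name, turns, and reads your mood” flow might look like, here’s a little Python sketch. I must stress that every function in it is a stand-in I invented for illustration; the folks at KODA haven’t shared anything about their actual perception stack with me.

```python
# A purely hypothetical sketch of the "hears its name, turns, reads your mood"
# flow described above. Every function here is a stand-in I invented for
# illustration; KODA's real perception stack has not been made public.

import random
from dataclasses import dataclass

@dataclass
class Perception:
    speaker_is_owner: bool
    bearing_deg: float        # direction the voice came from
    owner_face_found: bool
    estimated_mood: str

# --- toy stand-ins so the sketch runs end to end ---
def recognize_voice(audio: str) -> bool:
    return "owner" in audio                     # toy speaker identification

def locate_sound_source(audio: str) -> float:
    return 42.0                                 # toy binaural bearing estimate

def detect_face(image):
    return {"id": "owner"} if image else None   # toy face detector

def classify_mood(face) -> str:
    return random.choice(["happy", "neutral", "quizzical"])  # toy mood classifier

def turn_head(bearing_deg: float) -> None:
    print(f"Turning head to {bearing_deg:.0f} degrees")

# --- the pipeline itself ---
def handle_name_called(audio: str, image) -> Perception:
    speaker_is_owner = recognize_voice(audio)
    bearing = locate_sound_source(audio)
    if not speaker_is_owner:
        return Perception(False, bearing, False, "unknown")
    turn_head(bearing)                          # face the person who called
    face = detect_face(image)
    mood = classify_mood(face) if face else "unknown"
    return Perception(True, bearing, face is not None, mood)

print(handle_name_called("the owner says KODA", image="camera frame"))
```

In the real thing, of course, each of those toy stand-ins would be a serious model in its own right.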

As an aside, way back in the mists of time we used to call 1968, the American writer Philip K. Dick published his classic science fiction novel Do Androids Dream of Electric Sheep? This somewhat grim tale is set in a post-apocalyptic San Francisco, where Earth’s life has been greatly damaged by a global nuclear war that has left most animal species endangered or extinct. The main plot follows Rick Deckard, a bounty hunter who is tasked with “retiring” (i.e., killing) six Nexus-6 model androids — robot servants virtually indistinguishable from humans — that have escaped from Mars and traveled to Earth. Deckard hopes to earn enough bounty money from dispatching the androids to buy a live animal to replace his failing pet, which is an electric sheep.

This story formed the basis for Blade Runner, which I personally class as one of the greatest movies of all time. I must have seen this film on 30 or more occasions, and I discover something new every time I watch it. The aesthetics and atmospherics, augmented by an astounding soundtrack, are impactive, disturbing, and poignant — all at the same time. When the leader of the androids (called “replicants” in the movie), Roy Batty, played by Rutger Hauer, gives his “Tears in Rain” soliloquy near the end — “I’ve seen things you people wouldn’t believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die” — all I can say is that I’m wracked with emotion.

I could waffle on about Blade Runner for hours (I know you’re surprised), but I fear we are in danger of wandering off into the weeds (again, I know you’re surprised). Things have certainly come a long way on the robotics front since the introduction of the Furby, which was originally released in 1998. OMG: that’s 22 years ago as I pen these words. I once met one of the guys who created the Furby, and it was amazing to hear how this little rascal came to fruition, including the creation of its language, which almost sounded like something you should be able to understand, especially when two or more Furbies started chatting with each other.

Today, there are a bunch of cheap-and-cheerful robot dog toys around, some of which sing and dance and respond to voice commands (like this little scamp for $55.99), but these are relatively simple devices that would be entertaining only to little people from 2 to 10 years old — and to bigger people like me, of course.

KODA is a horse (well, dog) of a different color. In addition to four high-resolution video cameras and a 13-megapixel camera on the front of its head with which it can capture high-quality photographs, KODA is replete with an abundance of high-torque motors, a cornucopia of state-of-the-art sensors, and a profusion of processors in the form of CPUs, GPUs, DSPs, and FPGAs, resulting in 11 teraflops of raw processing power. As a result, KODA is equipped with binocular vision, binaural hearing, haptic touch, the ability to smell and taste, and environmental sensing. I was just thinking that KODA boasts more sensors and processors than you can shake a stick at (“stick”… “dog”… get it? Tell your friends I’m playing here all week).

KODA is a robot dog with personality (Image source: KODA)

Sad to relate, the folks at KODA are reluctant to release too much about their technology prior to the official launch, whenever that might be (which should give you a clue as to how little I actually know). What I do know is that KODA takes full advantage of the InterPlanetary File System (IPFS), which is both a protocol and peer-to-peer network for storing and sharing data in a distributed file system.

According to Wikipedia, “IPFS uses content-addressing to uniquely identify each file in a global namespace connecting all computing devices. IPFS allows users to not only receive but host content, in a similar manner to BitTorrent. As opposed to a centrally located server, IPFS is built around a decentralized system of user-operators who hold a portion of the overall data, creating a resilient system of file storage and sharing. Any user in the network can serve a file by its content address, and other peers in the network can find and request that content from any node who has it using a distributed hash table (DHT).”
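
To give you a feel for what “content-addressing” means in practice, the few lines of Python below capture the core idea: the address of a piece of data is derived from the data itself, and any peer that happens to be holding that data can serve it up, with the requester verifying what it received against the address it asked for. This is purely my own toy illustration of the concept, not how IPFS is actually implemented (the real system uses multihashes and a proper distributed hash table, for a start).

```python
# A toy illustration of content-addressing: the address of a piece of data is a
# hash of the data itself, and any peer holding that data can serve it. This is
# only the core concept; it is not how IPFS itself is implemented.

import hashlib

def content_address(data: bytes) -> str:
    """Derive an address from the content (IPFS uses multihashes; plain SHA-256 here)."""
    return hashlib.sha256(data).hexdigest()

# Three "peers," each holding a portion of the overall data, keyed by content address.
peers = [dict(), dict(), dict()]

def publish(data: bytes) -> str:
    cid = content_address(data)
    peers[hash(cid) % len(peers)][cid] = data   # whichever peer ends up holding it
    return cid

def fetch(cid: str) -> bytes:
    # Ask around the network; any node that has the content can serve it, and the
    # requester can verify what it received against the address it asked for.
    for node in peers:
        if cid in node:
            data = node[cid]
            assert content_address(data) == cid, "content does not match its address"
            return data
    raise KeyError(f"no peer is holding {cid}")

cid = publish(b"KODA gait parameters, version 7")
print(cid[:16], "...", fetch(cid))
```

In the real IPFS network, of course, the “peers” are machines scattered all over the planet, not three Python dictionaries.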

All this ties in with something else John told me, which is that when your KODA is “sleeping” (a.k.a. recharging), it can “dream” (a.k.a. use its computational capabilities) to perform a variety of tasks, from mining bitcoins — thereby allowing KODA to pay for its own keep — to taking part in the ever-growing list of distributed computing projects.
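
I have no idea how KODA will actually schedule its “dreams,” but the basic idea is simple enough to sketch: only hand the spare compute over to background work while the dog is sitting on its charger. The battery check and the job list below are pure invention on my part.

```python
# A minimal sketch of "dreaming while recharging": only hand the robot's spare
# compute to background jobs while it is docked. The battery check and the job
# list are assumptions I made up for illustration.

import time

def is_charging() -> bool:
    return True                        # stand-in for a real battery/dock status query

background_jobs = [
    "hash blocks for a cryptocurrency network",
    "crunch work units for a distributed-computing project",
    "replay and compress the day's sensor logs",
]

def dream() -> None:
    for job in background_jobs:
        if not is_charging():          # wake up the moment the dog is unplugged
            break
        print("Dreaming:", job)
        time.sleep(0.1)                # yield between work units

dream()
```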

Something else John mentioned was the concept of “ephemeral memory loss.” This one is a little tricky to wrap your brain around, especially when you are as forgetful as your humble narrator, but the idea — as I understand it — is that when we learn how to perform some task (like discovering how to walk as an infant), we only keep hold of the core concepts while much of what we master is fugacious at best. As a result, we end up learning the same thing over and over again in slightly different ways, eventually homing in on an optimal solution, and we keep on learning and forgetting and relearning in the future in response to any changes in our environments.
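
If it helps, the following few lines of Python show how I picture this learn-forget-relearn cycle in my head. The “importance” scores and the forgetting threshold are entirely my own invention; this is just a way of visualizing the idea, not anything the folks at KODA have published.

```python
# A toy model of the learn/forget/relearn cycle: keep the core of a skill,
# let the low-importance detail fade during "sleep," and fill it back in
# (slightly differently) the next time it is needed. The "importance" scores
# and the threshold are my own invention.

import random

def learn_skill(size: int = 8) -> dict:
    """Learn a skill as a bag of parameters, each tagged with how 'core' it is."""
    return {f"param_{i}": {"value": random.random(), "importance": random.random()}
            for i in range(size)}

def sleep_and_forget(skill: dict, keep_above: float = 0.5) -> dict:
    """Ephemeral memory loss: only the high-importance parameters survive."""
    return {name: p for name, p in skill.items() if p["importance"] >= keep_above}

def relearn(skill: dict, size: int = 8) -> dict:
    """Re-acquire the forgotten parameters, a little differently each time."""
    refreshed = dict(skill)
    for i in range(size):
        name = f"param_{i}"
        if name not in refreshed:
            refreshed[name] = {"value": random.random(), "importance": random.random()}
    return refreshed

walking = learn_skill()
for night in range(3):                  # learn, forget, relearn, repeat
    walking = relearn(sleep_and_forget(walking))
print(len(walking), "parameters retained after three cycles of forgetting and relearning")
```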

Apparently KODA has been imbued with this ability. Furthermore, every KODA is part of the pack, so if one KODA grows up in a beach hut on a Pacific island and knows only sunshiny days and how to walk on sand, while another KODA is raised at a ski resort in Colorado where it is obliged to learn how to navigate through a snowstorm while walking on ice, then all of their accumulated knowledge will be shared via the cloud, and both KODAs will have access to each other’s skills should the occasion arise (this is one example of where the decentralized AI comes into the picture).
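
The folks at KODA haven’t told me how this pack-wide sharing actually works under the hood, so here’s a minimal sketch assuming something as simple as a shared skill registry in the cloud. The skill names and gait parameters are, once again, figments of my imagination.

```python
# A toy "pack registry": each KODA uploads the skills it has learned, and any
# other KODA can pull down a skill it has never needed until the occasion
# arises. Purely illustrative; nothing here reflects KODA's real architecture.

pack_registry: dict[str, dict] = {}     # skill name -> learned parameters

def share_skill(dog: str, skill: str, model: dict) -> None:
    print(f"{dog} uploads '{skill}' to the pack")
    pack_registry[skill] = model

def fetch_skill(dog: str, skill: str) -> dict:
    print(f"{dog} downloads '{skill}' from the pack")
    return pack_registry[skill]

# One dog grew up on sand, the other on ice...
share_skill("beach_koda", "walk_on_sand", {"stride": 0.9, "foot_lift": 0.2})
share_skill("ski_koda", "walk_on_ice", {"stride": 0.4, "foot_lift": 0.6, "slip_comp": 0.9})

# ...and when the beach dog gets taken on a ski trip, it already knows what to do.
print(fetch_skill("beach_koda", "walk_on_ice"))
```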

Quite apart from anything else, its mannerisms make KODA as cute as a button, with the resulting danger that people could easily start to form emotional attachments to this little beauty.

People could easily form emotional attachments to their KODAs (Image source: KODA)

If I haven’t already made this clear, I should perhaps make note of the fact that KODA is still under development and isn’t an actual product… yet. Having said this, the folks at KODA were keen to impress on me that KODA is going to be a real artifact, and not simply a proof of concept. Either way, I can absolutely see that all of the concepts and technologies coming out of KODA will be applicable to a diverse range of other applications and products in the future. As always, I would be interested in hearing your thoughts on all of this. Meanwhile, I’m going to start rummaging around my office looking for loose change and hidden treasures in the hope I can rustle up enough to place a down payment on my very own KODA.

2 thoughts on “Do Robot Dogs Dream of Cyborg Cats?”

  1. Max,

    News like this tends to cause me to “toss and turn.”

    Knowing from observing history how well the human race handles anything, news like this can be disturbing to me.

    No thoughts of “sugar plums” and “fairies” dance in my head, but vivid dreams of “Terminator” and “I, Robot” furiously swirl.

    After all, we already have cars that tend to automagically crash into inanimate objects or perform automated flame thrower imitations.

    Fred

    1. Hi Fred — I know what you mean — on the one hand, I think this stuff is tremendously exciting — on the other hand, I see a future with me cowering in the ruins hiding from packs of marauding robot dogs. Have you been watching that TV series “Next”?

