I often think of a future that involves truly intelligent robots working alongside humans to make the world a better place for all of us. My wife (Gina the Gorgeous) is constantly asking me how long it will be until we have our own robot to help with household tasks. I have no idea how to answer this question. Looking back over the last 40 years, some of the things we — engineers, scientists, and technologists — thought were going to be hard turned out to be much easier than we expected. Contrariwise, some of the things we predicted were going to be relatively easy turned out to be significantly more difficult to achieve than we anticipated. On this basis, all I can say is that truly intelligent robots may arrive much sooner or much later than we think (and you can quote me on that).
What do you think of when you hear the term “robot”? Does your mind turn to the unintelligent, fixed-in-place behemoths found in places like car assembly plants? Or do things like the marvels coming out of Boston Dynamics spring to mind? All I can say is it’s embarrassing to discover that a robot can dance better than you (as seen in this video), especially when — as in my case — your father was a professional dancer on the variety hall stage.
For myself, all sorts of thoughts start bouncing around my poor old noggin. For example, I think of Isaac Asimov’s classic stories The Caves of Steel and The Naked Sun. These tales, which are set about three thousand years in our future, feature a robot character called R. Daneel Olivaw, who can pass for a human. (I also think of Asimov’s novelette The Bicentennial Man, which involves a robot called Andrew who is trying to have himself acknowledged by the world government as being a human being.)
The Humans TV series comes to mind, as does the Westworld TV series (and, of course, the 1973 Westworld movie starring Yul Brynner). It’s also fair to say that it will be a long time before I manage to forget some of the scenes in the 2021 post-apocalyptic science fiction thriller film Mother Android. And, of course, we certainly cannot neglect one of my all-time favorites, Great Sky River by Gregory Benford. This odyssey is set tens of thousands of years in the future, when humans expanding through the galaxy encounter ancient mechanoid civilizations that aren’t overly happy to meet and greet us.
My meandering musings above were triggered by the fact that I was just chatting with Keith McMillen, who is the Founder and CTO of BeBop Sensors. I’m fortunate in that I get to meet all sorts of amazing and interesting people as part of writing my columns, but it has to be said that Keith stands proud in the crowd. I cannot begin to tell you all of the stuff he told me, including the companies he founded and sold, like Zeta Music, which revolutionized stringed instruments (you should hear their electric violins) and which was acquired by Gibson Guitars in 1992.
Suffice it to say that Keith is one of those rare serial inventors who recognizes a need and conceives a solution before anyone else. By comparison, I’m like most of us in that I often come up with amazingly clever ideas, only to discover that someone else (someone like Keith, now I come to think about it) got there first.
The reason I’m waffling on about all of this here is that Keith’s latest and greatest invention is a smart fabric sensor technology that can be deployed for all sorts of tasks, including human-machine interfaces (HMIs), digital health, and alternative realities such as augmented reality (AR), virtual reality (VR), and augmented virtuality (AV) (see also What the FAQ are VR, MR, AR, DR, AV, and HR?).
Of especial interest in the context of this column is BeBop RoboSkin, an incarnation of Keith’s smart fabric that can be used to create a skin-like covering, one that provides humanoid robots with tactile awareness exceeding that of human beings in both spatial resolution and sensitivity.
Robot hand equipped with BeBop RoboSkin picking up a ball
(Image source: BeBop Sensors)
The RoboSkin itself is formed from a polyester-nylon non-woven fabric that’s less than 1 mm thick. A special process ionically bonds conductive nanoparticles to the outside of every fiber. When the fabric is deformed — say by pressing it onto something — its resistance (and hence its conductivity) changes accordingly.
The clever part here is that the fabric can be literally cut (or formed) to the desired shape, because the actual sensing elements that are used to detect the changes in the fabric’s conductivity are mounted under the fabric — on the robot’s fingertips, in this particular example.
In this case, there are 80 taxel sensors (think “tactile pixels”) in each fingertip, arranged in an array with a 2 mm x 3 mm pitch, and all of the sensors in all of the fingers can be scanned thousands of times a second. Keith says that humans have about a 4 mm pitch with respect to the nerves in our fingertips. When I first heard this, I thought, “That can’t be so,” but Keith assured me that there’s an easy way to test this. Take two plastic rods with curved tips — like knitting needles, for example — hold them side by side, ask someone to close their eyes, and touch the tips of the rods to the tip of the subject’s finger. It’s only when the tips of the rods are 4 mm or more apart that the subject can distinguish them as two separate points.
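For the firmware folks among us, here’s a minimal sketch of what scanning an array like this might look like. To be clear, everything here is assumption on my part — the 8 x 10 grid arrangement, the read_adc() interface, and the voltage-divider wiring are invented for illustration, not BeBop’s actual design:

```python
# Hypothetical sketch of scanning a resistive taxel matrix (not BeBop's API).
# Assumes the 80 taxels are wired as an 8 x 10 row/column grid, with each
# taxel read through a voltage divider against a fixed resistor.

import random  # stands in for real ADC hardware in this sketch

ROWS, COLS = 8, 10      # 80 taxels, matching the per-fingertip count
V_REF = 3.3             # drive voltage on the selected row (assumed)
R_FIXED = 10_000        # fixed divider resistor in ohms (assumed)

def read_adc(row: int, col: int) -> float:
    """Placeholder for a real ADC read; returns volts in (0, V_REF)."""
    return random.uniform(0.1, V_REF - 0.1)

def scan_frame() -> list[list[float]]:
    """Return one frame of taxel resistances (ohms), row by row."""
    frame = []
    for r in range(ROWS):
        # In hardware: drive row r, leave the other rows high-impedance.
        row_vals = []
        for c in range(COLS):
            v = read_adc(r, c)
            # Divider: v = V_REF * R_FIXED / (R_taxel + R_FIXED),
            # so R_taxel = R_FIXED * (V_REF - v) / v. Pressing the
            # fabric lowers R_taxel, which raises v.
            row_vals.append(R_FIXED * (V_REF - v) / v)
        frame.append(row_vals)
    return frame

print(f"taxel[0][0] is about {scan_frame()[0][0]:.0f} ohms")
```

Scanning five fingers thousands of times a second would presumably call for parallel ADC channels and DMA rather than a polling loop like this one, but row-drive/column-read is the standard way to sweep a resistive matrix.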
One of the examples Keith presented that really blew my mind was a robot finger reading Braille. As soon as I saw this, I remembered trying to read Braille myself, and being completely unable to distinguish the number and relative locations of the raised dots that were under my finger.
Robot finger equipped with BeBop RoboSkin reading Braille
(Image source: BeBop Sensors)
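Seeing that Braille demo got me wondering what the decoding step might look like. The sketch below is purely my own illustration — BeBop hasn’t described their pipeline, and the taxel-to-dot thresholding is invented — but the dot-to-letter table is the standard Braille encoding, and mapping a thresholded patch of taxels onto the six dot positions of a Braille cell is the obvious approach:

```python
# Hypothetical decoding of a thresholded 3 x 2 taxel patch into a Braille
# letter. Dots in a Braille cell are numbered 1-3 down the left column
# and 4-6 down the right column; letters a-j are shown here, and the
# full alphabet extends the same scheme.

BRAILLE = {
    frozenset({1}): "a",          frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",       frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",       frozenset({1, 2, 4}): "f",
    frozenset({1, 2, 4, 5}): "g", frozenset({1, 2, 5}): "h",
    frozenset({2, 4}): "i",       frozenset({2, 4, 5}): "j",
}

DOT_NUMBER = [[1, 4], [2, 5], [3, 6]]   # dot number at (row, col) of cell

def decode_cell(patch, threshold=0.5):
    """patch: 3 x 2 grid of taxel pressures covering one Braille cell."""
    raised = {DOT_NUMBER[r][c]
              for r in range(3) for c in range(2)
              if patch[r][c] > threshold}
    return BRAILLE.get(frozenset(raised), "?")

# The letter "h" is dots 1, 2, and 5: top-left, mid-left, mid-right.
print(decode_cell([[0.9, 0.1],
                   [0.8, 0.7],
                   [0.1, 0.0]]))   # -> "h"
```

A real system would also have to segment the cells as the finger drags along a line of text, which is where those thousands-of-frames-per-second scan rates would earn their keep.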
When you come to think about it, we hear a lot on the news these days about equipping robots with human-like perception in terms of sight and sound. Adding a sense of touch would seem to be a logical development (Mr. Spock would be pleased).
Keith says that, in addition to pressure, RoboSkin can detect and identify dragging and shearing forces, and it can even be used to perceive different textures. A large part of this involves the use of artificial intelligence (AI) and machine learning (ML) to make sense of the cornucopia of sensory data, if you see what I mean.
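Purely to give a flavor of what that ML step might involve, here’s a toy sketch of classifying smooth versus rough textures from windows of taxel frames. The windowing, the feature choices, and the synthetic training data are all my own assumptions — the folks at BeBop haven’t said which models they actually use:

```python
# Toy texture classifier over taxel frames (illustrative assumptions only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

ROWS, COLS, WINDOW = 8, 10, 50   # 80 taxels, 50-frame window (assumed)
rng = np.random.default_rng(0)

def features(window: np.ndarray) -> np.ndarray:
    """Summarize a (WINDOW, ROWS, COLS) stack of frames as a feature vector:
    mean pressure map, per-taxel temporal variance (the texture 'buzz'),
    and spatial gradients (edges under the finger)."""
    mean_map = window.mean(axis=0)
    var_map = window.var(axis=0)
    gy, gx = np.gradient(mean_map)
    return np.concatenate([mean_map.ravel(), var_map.ravel(),
                           np.abs(gx).ravel(), np.abs(gy).ravel()])

def fake_window(rough: bool) -> np.ndarray:
    """Synthetic data: rough surfaces add frame-to-frame vibration."""
    base = rng.random((ROWS, COLS))  # static contact pattern
    jitter = (0.5 if rough else 0.05) * rng.standard_normal(
        (WINDOW, ROWS, COLS))
    return base + jitter

X = np.stack([features(fake_window(i % 2 == 1)) for i in range(200)])
y = np.array([i % 2 for i in range(200)])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

print("rough?", bool(clf.predict([features(fake_window(True))])[0]))
```

Drag and shear detection would presumably lean on the same sort of temporal and spatial features, tracked across neighboring taxels as the contact patch moves.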
I like a pun as much as the next humanoid-shaped entity, but even I winced when I saw the title of this video, which is BeBop RoboSkin “Making Robots Feel Better” (arrggghhh).
To be honest, what I laughingly call my mind is still reeling with everything I heard. At the same time, my noggin is buzzing with ideas for where BeBop Sensors’ smart fabric can be used in general, and where BeBop RoboSkin can be used in particular. What say you? Do you have any thoughts you’d care to share on any of this?