There’s a lot of focus these days on the power of computer vision, and much of that is being directed at recognizing gestures so that we can interact with our machines without touching them or communicating our intent out loud for everyone else in the room to hear.
We looked at Plessey’s EPIC sensor before; it detects fine changes in electric fields. Plessey now has a gesture library for it, to be shown this week at CES: the software quite literally interprets the minute electric-field changes caused by your waving hands.
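Plessey hasn’t published the internals of that gesture library, but the basic idea is easy to picture. Here’s a minimal sketch, assuming two passive e-field sensors side by side: a passing hand perturbs the ambient field at each sensor, and whichever channel reacts first tells you the swipe direction. The thresholds, timing window, and read_epic_pair() are all made up for illustration; they stand in for whatever the real driver provides.

```python
# Hypothetical swipe detector over two passive e-field channels.
# A hand passing overhead perturbs the ambient field; the order in
# which the left and right channels cross a threshold gives direction.

import time

THRESHOLD = 0.05   # assumed: minimum field deviation that counts as a hand
WINDOW_S = 0.5     # assumed: max time allowed between the two channels firing

def read_epic_pair():
    """Placeholder: return (left, right) field deviations from baseline."""
    raise NotImplementedError("replace with real sensor I/O")

def detect_swipe():
    left_t = right_t = None
    deadline = time.monotonic() + WINDOW_S
    while time.monotonic() < deadline:
        left, right = read_epic_pair()
        if left_t is None and abs(left) > THRESHOLD:
            left_t = time.monotonic()
        if right_t is None and abs(right) > THRESHOLD:
            right_t = time.monotonic()
        if left_t is not None and right_t is not None:
            return "left-to-right" if left_t < right_t else "right-to-left"
    return None  # nothing waved past in time
```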
Back in November, Microchip announced their own e-field gesture sensor. Unlike the EPIC sensor, which works with the earth’s ambient e-field, this device generates a local e-field and then measures disturbances to it.
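To make the transmit-and-measure idea concrete, here’s a hedged sketch: a Tx electrode radiates a low-frequency field, several Rx electrodes report how much of it they receive, and a nearby hand shunts energy away from whichever electrodes it’s closest to. The four-electrode layout, baselines, and read_rx_levels() are assumptions for illustration, not Microchip’s actual interface.

```python
# Assumed layout: four Rx electrodes around a central Tx electrode.
# More attenuation at an electrode = the hand is closer to that side,
# so a weighted centroid of the attenuations gives a rough (x, y).

BASELINE = {"north": 1.0, "south": 1.0, "east": 1.0, "west": 1.0}

def read_rx_levels():
    """Placeholder: return received-signal level per Rx electrode."""
    raise NotImplementedError("replace with real sensor I/O")

def hand_position():
    levels = read_rx_levels()
    atten = {k: max(BASELINE[k] - levels[k], 0.0) for k in BASELINE}
    total = sum(atten.values())
    if total < 1e-6:
        return None  # no hand disturbing the field
    # Map each electrode to a unit vector and average, weighted by attenuation.
    dirs = {"north": (0, 1), "south": (0, -1), "east": (1, 0), "west": (-1, 0)}
    x = sum(atten[k] * dirs[k][0] for k in atten) / total
    y = sum(atten[k] * dirs[k][1] for k in atten) / total
    return (x, y)
```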
Meanwhile, Elliptic Labs is exploiting yet another phenomenon: sound – ultrasound, to be precise. (So you don’t bother your cubemate, just your cubemate’s dog.) This relies on a special speaker/microphone set that you can layer over the computer screen or integrate into the bezel (the existing speakers and mike might work, but not as well). It uses the reflections from your hands to interpret your motions, and because the emitted signals are encoded, one machine can also detect and “bind” to another, enabling what Elliptic calls “social gestures.”
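The echo-ranging half of that is classic sonar, and a small sketch shows the shape of it: emit a known ultrasonic code, record the microphone, and cross-correlate to find the echo delay, which gives hand distance. Using a distinctive code rather than a plain tone is also what would let a listener recognize which device is transmitting – presumably the hook behind the cross-device binding. Every parameter below (sample rate, chirp band, the assumption that recording starts at the moment of emission) is my own illustrative choice, not Elliptic’s design.

```python
# Toy ultrasonic echo ranging via cross-correlation with a known chirp.

import numpy as np

FS = 192_000            # assumed sample rate (Hz), high enough for ultrasound
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def make_code(fs=FS, f0=20_000, f1=40_000, dur=0.005):
    """A 5 ms linear chirp from 20 to 40 kHz serving as the 'known code'."""
    t = np.arange(int(fs * dur)) / fs
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * dur)))

def echo_distance(recording, code, fs=FS):
    """Distance to the strongest reflector of `code`.

    Assumes `recording` starts at the instant the code was emitted,
    so the correlation peak offset is the round-trip delay.
    """
    corr = np.correlate(recording, code, mode="valid")
    delay_samples = int(np.argmax(np.abs(corr)))
    round_trip_s = delay_samples / fs
    return SPEED_OF_SOUND * round_trip_s / 2  # one-way distance in meters
```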
You can find out more about Elliptic’s technology in their release; likewise for Microchip’s sensor. Plessey’s note was a lure to their CES booth, so that’s where you can get a live demo if you happen to be in town.