
Touch Redux

It’s weird when engineers get all touchy-feely.

Yet that’s what’s happening, especially in cell phone design, tablets, and today’s coolest user interfaces. Touch-sensitive screens are all the rage, and it looks like a trend that’s here to stay.

Is it, as Angus Young might bawl, a touch too much?

Atmel and Cypress sure hope not. Both companies (plus a handful of others) are making hay while the sun shines. They’re shipping touch-sensitive controllers out the door as fast as the semiconductor elves in the hollow tree can bake new ones. Touch screens have been a boon to these companies, and both are investing heavily in research and filling their product catalogs with new and interesting devices.

Atmel quite reasonably brags that 9 of the 10 best cell phones (according to the November issue of PC World) use its touch-sensing chips. The exception, naturally, is Apple. That’s a pretty impressive endorsement of its products, and Atmel is doing its best to keep the ball rolling.

It’s also an impressive ramp rate for touch-screen interfaces in general. Atmel went from having practically no touch-screen chips in 2009 to shipping $140 million worth of the little buggers in 2010. Zero-to-140M in 12 months is pretty face-stretching acceleration.

Touch screens are so popular in part because they’re cool. Not since Star Trek: The Next Generation have we seen people rubbing their hands over their computer consoles to elicit beeps and squeaks. Now all we need is holographic displays to get the full Minority Report effect.

Touch screens also save space, of course, which is really why they’re so popular on cell phones. Phones with real keyboards (albeit Lilliputian ones) have become passé. Real hipsters leave fingerprints on the screen. And touch screens enable some novel user-input gestures that you couldn’t do with a mouse. “Pinching” a picture doesn’t work without two fingers on the screen at once, something you can’t really do any other way. “Flipping” pages by swiping a finger is so much more gratifying than just pressing PgDn.
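
Under the hood, a pinch is just arithmetic: track the distance between the two touch points and treat the ratio of current to starting distance as a zoom factor. A minimal sketch in C (the types and function names here are illustrative, not any particular vendor’s API):

```c
#include <math.h>

typedef struct { int x, y; } touch_point_t;

/* Euclidean distance between two touch points. */
static float touch_distance(touch_point_t a, touch_point_t b)
{
    float dx = (float)(a.x - b.x);
    float dy = (float)(a.y - b.y);
    return sqrtf(dx * dx + dy * dy);
}

/* Zoom factor for a pinch: >1.0 means the fingers are spreading
 * (zoom in), <1.0 means they are pinching together (zoom out). */
float pinch_zoom_factor(touch_point_t start_a, touch_point_t start_b,
                        touch_point_t now_a,   touch_point_t now_b)
{
    float initial = touch_distance(start_a, start_b);
    if (initial == 0.0f)
        return 1.0f; /* degenerate case: fingers started at one spot */
    return touch_distance(now_a, now_b) / initial;
}
```

A real gesture recognizer layered on top would also debounce the transition between one and two fingers, but the core really is that simple.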

All of this GUI goodness requires some elaborate engineering, though, and that’s an area where most of us find our EE backgrounds lacking. Capacitive touch sensing (as opposed to the resistive kind) relies on oh-so-subtle detection of ephemeral phenomena, something that’s quite outside the realm of most digital engineers’ experience. That’s where ready-made touch controllers come in.

Atmel’s maXTouch line of chips does all the heavy lifting, so to speak. Each chip is more or less divided in half, with one half handling the delicate analog front end while the other half is a conventional microcontroller. The analog portion (no surprise) talks directly to your product’s screen—or more accurately, to the air space above it—while the MCU half translates each disturbance in the force into bits and bytes more familiar to the typical digital designer.
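
From the host processor’s side, the digital half’s output looks something like the hedged sketch below: poll the controller over I²C and unpack a touch report. The address, register, message layout, and i2c_read() helper are hypothetical placeholders; the real maXTouch protocol is an object-based register map documented in Atmel’s datasheets.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical address and register map; maXTouch's real object-based
 * protocol is more elaborate (see the datasheet for the actual layout). */
#define TOUCH_I2C_ADDR   0x4A   /* hypothetical 7-bit I2C address */
#define REG_MSG          0x00   /* hypothetical: next pending message */

/* Platform-supplied I2C read; assumed to exist for this sketch. */
extern bool i2c_read(uint8_t addr, uint8_t reg, uint8_t *buf, int len);

typedef struct {
    bool     down;  /* finger present? */
    uint16_t x, y;  /* position in touch-sensor coordinates */
} touch_event_t;

/* Poll one touch report; returns false if nothing is pending. */
bool touch_poll(touch_event_t *ev)
{
    uint8_t msg[5];
    if (!i2c_read(TOUCH_I2C_ADDR, REG_MSG, msg, sizeof msg))
        return false;
    ev->down = (msg[0] & 0x80) != 0;              /* hypothetical status bit */
    ev->x    = (uint16_t)(msg[1] << 8 | msg[2]);  /* hypothetical X field */
    ev->y    = (uint16_t)(msg[3] << 8 | msg[4]);  /* hypothetical Y field */
    return true;
}
```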

Different maXTouch chips support different touch resolutions, which isn’t the same thing as screen resolution. For example, you can have as few as 224 “nodes” of touch sensitivity on top of your 1024×768 screen, or as many as 768 nodes. It’s all a matter of how fine-grained you want your touch interface to be. The lower node counts are probably only feasible for small screens (e.g., cell phones), while the 768-node version is fine for an iPad-like tablet. On the other, um, hand, if you’re making an industrial user interface with big, fat buttons for users wearing gloves, a lower node count over a big screen may make perfect, uh, sense.

Between these two extremes, Atmel makes a 384- and a 540-node controller. Most of them have a 32-bit AVR microcontroller inside; only the wee little 224-node chip uses an 8-bit MCU.
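
Because the node grid and the pixel grid are independent, host firmware ends up scaling between the two coordinate systems. A minimal sketch, assuming (purely for illustration) a controller that reports interpolated positions on a 0–4095 scale over a 1024×768 display:

```c
#include <stdint.h>

/* Illustrative values: a 1024x768 display over a touch sensor whose
 * controller reports positions on a 0..4095 interpolated scale. */
#define SCREEN_W   1024
#define SCREEN_H   768
#define TOUCH_MAX  4095

/* Scale a reported touch coordinate to a pixel coordinate. */
static uint16_t touch_to_pixel_x(uint16_t tx)
{
    return (uint16_t)(((uint32_t)tx * (SCREEN_W - 1)) / TOUCH_MAX);
}

static uint16_t touch_to_pixel_y(uint16_t ty)
{
    return (uint16_t)(((uint32_t)ty * (SCREEN_H - 1)) / TOUCH_MAX);
}
```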

Starting this week, the company is rolling out improved E-series versions of its touch-sensing chips. Atmel has upgraded the analog front end to be even more sensitive to fingers while also being less sensitive to outside interference. (The microcontrollers stay the same.) The improved noise rejection is a good thing, especially for first-time touch engineers, because noise is the biggest problem that rookies tend to have. Capacitive user interfaces are notoriously, er, touchy, and they’re difficult to debug. Radiated noise from power supplies, RF components, and even fluorescent lighting can upset the delicate sensibilities of the touch-screen interface. Isolating the analog front end from these sources of interference can be tricky, especially when you’re working on a small product. Atmel has beefed up its analog filtering and tweaked the onboard firmware to reject extraneous noise while still detecting bona fide fingertips. And taps.
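
Some of that noise rejection happens in the analog domain, but firmware does its share too. One classic trick (a sketch of the general idea, not Atmel’s actual algorithm) is a median-of-three filter on the raw capacitance samples, which throws away single-sample glitches from switching supplies or lamp ballasts without the lag of a long averaging window:

```c
#include <stdint.h>

/* Median-of-three: rejects a single glitched sample without
 * adding the lag of a long moving average. */
static uint16_t median3(uint16_t a, uint16_t b, uint16_t c)
{
    if (a > b) { uint16_t t = a; a = b; b = t; }
    if (b > c) { uint16_t t = b; b = c; c = t; }
    if (a > b) { uint16_t t = a; a = b; b = t; }
    return b;
}

/* Feed raw capacitance-to-digital samples through the filter, then
 * compare against a calibrated baseline plus a noise threshold. */
static int finger_present(uint16_t raw, uint16_t baseline, uint16_t threshold)
{
    static uint16_t hist[2];  /* the two previous samples */
    uint16_t filtered = median3(hist[0], hist[1], raw);
    hist[0] = hist[1];
    hist[1] = raw;
    return filtered > baseline + threshold;
}
```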

With touch-screen interfaces getting easier to design, it’s inevitable that they’ll become more prevalent. What happens then? Will we see keyboards disappear, and if so, would that be a bad thing? Perhaps physical “Chiclet” keyboards (think BlackBerry) will forever give way to on-screen keyboards (think iPhone), at least on smaller devices. Or maybe handwriting recognition will make a comeback—although, ironically, capacitive touch screens typically can’t recognize a pencil or ordinary stylus. They rely on the conductivity of human skin, so nonconductive pointers don’t work. Maybe we’ll all wind up signing our names with our fingertips instead of a pen. Finger-painting for grownups.

The next step might be multi-finger input methods, somewhat like playing the piano. Modern keyboards were developed in an age when one letter at a time was all the machine could handle. (And sometimes not even that; the QWERTY layout was reportedly created to be deliberately awkward to use, to prevent rapid typing that jammed the machine.) There’s nothing inherently correct about sequential one-finger typing. Maybe our descendants will learn to “type” with multiple fingers at once. (Mine can already do it using only their thumbs.) After more than a century of heavy use, the QWERTY keyboard may now be on its way out, thanks to touch screens.

Time to stand up, tug on your shirt, and say, “Make it so.”
