
Detecting Intuitive

A little over a year ago I went on a bit of a rant about intuitive design. Now… for those of you running for the door, I’m not going to reprise that rant. At least, not directly. But a comment at the recent Touch Gesture Motion conference got me thinking (always a dangerous thing), and from it came a new corollary conclusion.

The speaker noted that today’s phones were so intuitive that his 18-month-old could use them, and that, in fact, this was a problem, since the child might dial Europe in the middle of the night. We all chuckled; cute.

But then I thought, “That’s not intuitive.” If that were really intuitive, it would mean the 18-month-old is waking up in the middle of the night going, “OK, cool: Dad’s asleep now. I can make that call to Europe I’ve been wanting to make. Let’s see, how would I do that? Let me try… this. Ah! Success!”

In other words, intuition implies having an intent and then knowing, without learning, how to achieve the goal. That’s not what the 18-month-old is doing.

As I suggested in the prior piece, a baby is in super-learning mode. They try everything – touch, swipe, taste, smell, bang, drop – and watch what happens, duly filing away the results. Yes, they need lots of repetitions to convince themselves that, for example, gravity does indeed work every time (as any parent who has tried to keep the toys or dinner off the floor can attest). They have no idea what they’re doing in many cases: they just try something and see what happens.

They also try to swipe and touch screens and such. Why? Because they see us doing it. So they’re not undertaking intuitive actions; they’re learning how things work – the opposite of intuition. And I kind of drew this conclusion in the earlier piece.

But then it occurred to me: if you really want to test out intuitive, babies are NOT the right model. If you can give your grandparents a new device and they can immediately figure out how to use it, then you’ve got something intuitive. Unlike the baby, they don’t want to have to learn some new way of doing things; they just want it to work.

My guess would be that, by that standard, there are precious few intuitive interfaces. Because how many grandparents have been able to get going without asking for help from the grandkids?

(Yes, I know, this notion – however rational – will, if noticed at all, be duly ignored in the rush to keep convincing ourselves that we have intuitive stuff. Ah, groupthink…)

