
Putting the User First?

A mathematics professor was in full flow in a post-grad seminar. The board was covered in formulas, and, as he finished writing an equation, he said, “And from this, gentlemen,” (it is a very old story). “And from this, gentlemen, it is obvious…” and his voice died away. He stood there for a few seconds, then sat down for a few minutes, then left the room, returning after fifteen minutes. Picking up the chalk, he resumed, “And from this, gentlemen, it is obvious that…”, wrote another equation on the board, and continued the seminar.

We all know that mathematicians are different, but, to some extent, so are engineers. And this brings us to the starting point for today’s sermon. What is obvious to one person is not obvious to another. More to the point, what is obvious to the developer of a product is often not at all obvious to the user of that product. And this is symptomatic of the way in which we often fail to think about the user throughout the whole cycle of product development.

Start with the development of a new product. It could be a consumer device, a web site, or something for other engineers to integrate into a bigger system. In many areas it has now become commonplace to use focus groups to try to understand what customers want. Just getting a group of people into a room to discuss what they think they want looks very sensible, and focus groups have their uses if you are interested in what people are thinking. They are very good for gathering information about how customers view one company against another, or for measuring their understanding of a new or emerging technology. They are also great for developing and monitoring marketing communications programmes. And, properly used, they can identify the issues that people have.

But, as Henry Ford is reported to have said, “If I had asked people what they wanted, they would have said faster horses.” Instead, he identified what people needed: better transportation. No one at Sony ran focus groups for the Walkman, and Steve Jobs is famously against them. In all these cases, an understanding of what customers will need drives a creative person or team to build products that meet, or go beyond merely meeting, those needs.

Wittgenstein, the philosopher whose own thought processes are not straightforward or easy to understand, said, “Whereof one cannot speak, thereof one must be silent.” Until you have words for something, it is difficult to visualise it, to think about how you might use it, or even to know whether you will want it. A personal computer means nothing if the only computer you have ever seen is the behemoth of an IBM mainframe. And, even if you know computers can be far smaller, it still requires a leap of imagination to see how they can be used. Ken Olsen, of Digital Equipment Corp, certainly understood that computers could be smaller — the PDP-11 could easily fit under a desk — but he is widely quoted as saying, in 1977, “There is no reason for any individual to have a computer in his home.” What he was actually attacking was the idea that heating, air conditioning, lighting, and so on in the home would be controlled by a central computer. What his imagination didn’t stretch to was the idea that processors would become so cheap that they would be widely embedded and linked, as is happening now. Olsen also said, “People will get tired of managing personal computers and will want instead terminals, maybe with windows,” predicting, as others have, the idea of the Cloud, energetically puffed today by Google et al.

Back to the central thread: when thinking about a new product, the starting point should be defining what people need, whether they are consumers or the guys building the next embedded system. And I know it is easier to write that sentence than it is to actually define what people need.

Once the need is established, the thinking should be extended to imagining how people are going to use the product. User interfaces are far too often added almost as an afterthought (and designed by the people who built the product). Instead, the user interface can be the starting point of the design: implementing what the user requires the system to do, rather than adding an interface that reflects what the developer has built. The user interface has to be tested by potential users, not by the guy who is going to implement the system. And, ideally, it should be implemented by someone who understands human-machine interfaces and has at least a smidgen of design sensitivity. Apple’s iPhone and iPad interfaces look good and work well: many of the copies of them do neither.

Multi-layer user interfaces are a great approach, but they should have as their top layer what the user requires, not what the developer or the marketing group think is interesting, sexy or fun. Years ago, a book on Word for Windows claimed, “The toolbar you see [in Word] is not the toolbar that is most useful to you: it is the toolbar that Bill Gates and the marketing guys have decided will sell you Word.” And we all remember Clippy – Microsoft’s extreme expression of how the company always knows better than you what it is you want to do.
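To make the layering concrete, here is a minimal sketch in C (all the menu items, names, and actions are invented for illustration, not taken from any real product) of a two-layer menu structure: the top layer carries only the tasks the user came to perform, while the developer-centred options sit one level down, reachable but out of the way.

```c
#include <stdio.h>

/* A hedged sketch with invented items -- not from any real product. */
typedef struct MenuItem {
    const char *label;               /* wording in the user's terms, not the developer's */
    void (*action)(void);            /* what happens when the item is selected */
    const struct MenuItem *sublayer; /* deeper layer of the interface, or NULL */
    int sublayer_len;
} MenuItem;

static void play(void)        { puts("Playing..."); }
static void record(void)      { puts("Recording..."); }
static void set_bitrate(void) { puts("Bitrate set."); }
static void set_buffer(void)  { puts("Buffer size set."); }

/* Developer-centred settings live one layer down... */
static const MenuItem advanced[] = {
    { "Audio bitrate", set_bitrate, NULL, 0 },
    { "Buffer size",   set_buffer,  NULL, 0 },
};

/* ...while the top layer names only the tasks the user came to do. */
static const MenuItem top[] = {
    { "Play",     play,   NULL,     0 },
    { "Record",   record, NULL,     0 },
    { "Advanced", NULL,   advanced, 2 },
};

int main(void) {
    /* Render the top layer: this is all a first-time user ever sees. */
    for (int i = 0; i < 3; i++)
        printf("%d. %s\n", i + 1, top[i].label);
    return 0;
}
```

The point of the structure is that adding a new engineering option never touches the top layer: the user’s view stays stable while the lower layers grow.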

(As an aside – I have almost given up arguing with people who tell me that their interface is “intuitive.” Often what they mean is that users who have learned how to use the Windows interface will find it easier to use their product. Or that their engineers find it easy to use. In neither case is that intuitive – it is learned behaviour. To a child, red does not intuitively mean Stop, nor does green mean Go. We have learned it. This is probably a more widely learned reaction in a developed society than the Windows interface, but you can’t use the word intuitive to describe it.)

Working on the user interface often helps to produce a clearer product definition. And testing it with a range of users, in formal or informal contexts, does more than confirm that the interface is usable: it may well provide a better understanding of whether the product meets a user need.

The period before a product is built is also a good time to begin writing the user manual, even if it is to be delivered as help screens rather than as a printed document. The manual can be seen, perhaps, as an expression of the product specification in language the user understands, and it should not be written by a developer. Again, it should be test-driven. There is no point in support telling a user with a problem to RTFM when the FM is F-unreadable.

If it is necessary for a developer to write the manual, perhaps because of time or budget constraints, then the result should at least be translated into English (from engineer-speak) by a non-engineer and road-tested with a few potential users.

Error messages are another significant issue. “You have committed an illegal action” may make sense to the person who wrote it, but, as an error message on a consumer item, it is unhelpful and has caused serious alarm, with naïve users expecting a police raid.
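One way to keep engineer-speak off the screen is to separate the internal diagnostic from the user-facing wording, so that the message shown is written for the user while the detail support needs goes to a log. A minimal sketch in C, with invented error codes and messages:

```c
#include <stdio.h>

/* Hypothetical error codes -- the separation, not the codes, is the point. */
typedef enum { ERR_NONE, ERR_DISC_UNREADABLE, ERR_NET_DOWN, ERR_INTERNAL } ErrCode;

typedef struct {
    ErrCode     code;
    const char *user_msg; /* plain language: what happened, what to do next */
    const char *log_msg;  /* the detail that support actually needs */
} ErrEntry;

static const ErrEntry errors[] = {
    { ERR_DISC_UNREADABLE,
      "This disc can't be read. Please clean it and try again.",
      "ATAPI read failed: sector CRC error" },
    { ERR_NET_DOWN,
      "No network connection. Check the cable and try again.",
      "PHY link down on eth0" },
    { ERR_INTERNAL,
      "Something went wrong. Please restart the device.",
      "assertion failed in scheduler" },
};

static void report(ErrCode code) {
    for (size_t i = 0; i < sizeof errors / sizeof errors[0]; i++) {
        if (errors[i].code == code) {
            printf("%s\n", errors[i].user_msg);                        /* shown to the user */
            fprintf(stderr, "[err %d] %s\n", code, errors[i].log_msg); /* kept for support */
            return;
        }
    }
    /* Unknown code: still speak to the user in the user's own terms. */
    printf("Something went wrong. Please restart the device.\n");
}

int main(void) {
    report(ERR_DISC_UNREADABLE);
    return 0;
}
```

The user never sees “illegal action”; the developer’s vocabulary stays in the log, where it belongs.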

Things are improving in some areas. I have recently been talking to companies making chips for the automotive market, and they are talking not just to the people who will buy their chips, the Tier One suppliers, but to the car makers themselves, drawing on the car makers’ information about what they think drivers need. In the last few weeks we have seen chipmakers signing deals with ARM so that they can influence the longer-term definition of ARM’s products. Yet, across the industry, there is still too much of the thinking that says, “This is really cool — let’s turn it into a product and get rich.” Apart from the boring detail that turning something into a product takes at least as much time and money as building it in the first place: who is going to want it? What user need does it fill? And is the user going to be able to use it?
