Designing software for ease of use is hard, and as developers we seldom use our own software extensively once it has been delivered. Yet we are still expected to put ourselves in the user's seat, know how they work, make the application run as seamlessly as possible, and actually help them in their work. This can be hard!
We are, however, not the only people with this problem. As food for thought, we can take a look at how ordinary household appliances work. I'm pretty certain that the typical developers of washing machines, tumble dryers, stoves and, to some extent, even TVs are not the typical users of them at home.
Be aware of dangerous default settings.
I once had a washing machine which, when turned on, defaulted to a washing program with a temperature of 60 degrees Celsius. If you weren't aware of this, you could easily ruin some clothes.
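In software terms, the same principle applies to default parameter values and configuration: the value you get by doing nothing should be the safe one, and the risky value should require a deliberate choice. A minimal sketch, with hypothetical names modelled on the washing machine above:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WashProgram:
    # The zero-effort default is the harmless one; the risky 60-degree
    # wash must be requested explicitly by the caller.
    temperature_c: int = 30

def start_wash(program: Optional[WashProgram] = None) -> str:
    program = program or WashProgram()
    return f"Washing at {program.temperature_c} degrees"

print(start_wash())                               # safe default
print(start_wash(WashProgram(temperature_c=60)))  # hot wash is opt-in
```

The design choice is simply that forgetting to configure something should never be the dangerous path.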
Don’t disable settings for no particular reason.
My current washing machine has a quick wash program, which takes 15 minutes at 30 degrees. But for some peculiar reason, on this program the machine can only spin the clothes at 1200 RPM; it is simply not possible to select anything higher, even though the normal maximum is 1600 RPM. In other words: I want my clothes washed in a hurry, but I have plenty of time to wait for them to dry?
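Translated to software, this means validation code should only rule out combinations that are genuinely impossible, not ones the developer merely didn't anticipate a use for. A hypothetical sketch of the difference, reusing the washing machine's numbers:

```python
MAX_RPM = 1600  # the actual hardware limit

def select_spin_speed_restrictive(program: str, rpm: int) -> int:
    # Mirrors the machine above: quick wash arbitrarily capped at 1200 RPM,
    # even though the hardware supports 1600.
    if program == "quick" and rpm > 1200:
        raise ValueError("quick wash only supports up to 1200 RPM")
    return rpm

def select_spin_speed(program: str, rpm: int) -> int:
    # Only enforce the real limit; let users combine valid settings freely.
    if not 0 <= rpm <= MAX_RPM:
        raise ValueError(f"spin speed must be between 0 and {MAX_RPM} RPM")
    return rpm

print(select_spin_speed("quick", 1600))  # quick wash at full spin: allowed
```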
Make sure that settings have obvious names.
My tumble dryer has two buttons which, (probably inaccurately) translated from Danish, read Delicate and Protect. And I cannot, if my life depended on it, remember what the difference between the two is. I'm sure at least one of them makes the dryer use a lower temperature, but to figure out which one, I have to look in the manual. Every time.
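The same goes for option names in our own software: a setting should be named after its observable effect, not after a vague marketing word. A hypothetical sketch (the parameter names are invented, not from any real dryer API):

```python
# Cryptic: which of these lowers the temperature? You'd have to read the docs.
def dry(delicate: bool = False, protect: bool = False) -> None:
    ...

# Obvious: each name states what the option actually does.
def dry_clothes(low_temperature: bool = False, anti_crease: bool = False) -> str:
    mode = "low" if low_temperature else "normal"
    return f"Drying at {mode} temperature"

print(dry_clothes(low_temperature=True))  # no manual needed at the call site
```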
The user should be able to see what the application is doing.
My stove, a ceramic cooktop, has some lights that indicate whether the cooking zones are too hot to touch. But it has no indicator telling me whether any of the zones are actually turned on, or off, for that matter. I have to inspect all the knobs manually to find out.
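In an application, this principle means exposing the overall state directly, so the user never has to deduce it by examining each individual control. A small hypothetical sketch:

```python
class Stove:
    def __init__(self) -> None:
        self._zones = {1: 0, 2: 0, 3: 0, 4: 0}  # power level 0-9 per zone

    def set_zone(self, zone: int, power: int) -> None:
        self._zones[zone] = power

    def status(self) -> str:
        # One glance answers "is anything on?" -- no knob inspection needed.
        on = [zone for zone, power in self._zones.items() if power > 0]
        return f"zones on: {on}" if on else "all zones off"

stove = Stove()
print(stove.status())   # all zones off
stove.set_zone(2, 6)
print(stove.status())   # zones on: [2]
```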
Let the user multi-task if the scenario allows for it.
My LG TV has an Electronic Program Guide (EPG), which is fairly normal these days. But the brilliant designers have determined that when I use the EPG, I don't want to listen to the program I was watching: the sound is disabled when I open the guide. This seems like an odd decision (and it is very likely founded in something technical), since the sound would allow me to follow along in the program for a few seconds while reading up on what I'm actually watching 😉
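In application terms: opening a secondary view should not shut down the primary activity unless there is a real reason to. A hypothetical sketch of the state involved (not how any actual TV firmware works):

```python
class TvPlayer:
    def __init__(self) -> None:
        self.sound_on = True
        self.epg_open = False

    def open_epg(self) -> None:
        self.epg_open = True
        # Deliberately leave self.sound_on untouched: the user can keep
        # following the program while browsing the guide.

player = TvPlayer()
player.open_epg()
print(player.sound_on)  # True -- sound keeps playing under the overlay
```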
These few examples show that we, simply by looking around, can spot usability problems. But how can we avoid them? Well, we will probably have to talk with our users about that… 😉