Last year I asked a group of students to design and build a pleasurable user interface for Marijn Meijles. Marijn is a computer programmer who has difficulty with fine motor control. He prefers to use his keyboard instead of his mouse to control his computer. The first prototypes my students made were based on the assumption that keyboard navigation is primarily done with the Tab-key.
Here’s a video of a fancy looking interface that’s being navigated with the tab key.
After one week Marijn came over to our university to test the prototypes. When we tested these beautiful, Tab-key optimised first iterations with him, it turned out he never uses the Tab key. Instead he uses a combination of the arrow keys, the space bar, the Enter key, and his trackpad.
Here’s a video of Marijn not seeing any of the fancy tab-key interactions.
These new insights about how Marijn uses his computer were gained after just one week. That left my students enough time to create well-considered, tested, working prototypes of user interfaces that were easy for Marijn to use.
Here’s a video of an interface that’s been optimised specifically for how Marijn uses his keyboard. You can test it for yourself by using your arrow keys on this website.
This research didn’t invalidate the assumption that keyboard users use their Tab key: many do. What it did show is that keyboard usage can be much more varied and complicated than that.
Some of the solutions my students created were variations of spatial navigation, a type of navigation often seen in environments like TV-menus, tailored to Marijn’s situation. Other solutions focused on the reason why he uses his keyboard in a specific way. This is described in more detail in the chapter about adding nonsense.
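The students’ prototypes themselves aren’t reproduced here, but the core of spatial navigation can be sketched in a few lines. This is a minimal, hypothetical example, not their actual code: it assumes a simple grid of focusable items and computes which item should receive focus when an arrow key is pressed, clamping movement at the edges.

```javascript
// A sketch of grid-style spatial navigation (hypothetical, not the
// students' code). Given the index of the currently focused item, the
// key pressed, and the number of columns in the grid, return the index
// of the item that should receive focus next.
function nextIndex(current, key, columns, total) {
  let next = current;
  if (key === 'ArrowLeft') next = current - 1;
  if (key === 'ArrowRight') next = current + 1;
  if (key === 'ArrowUp') next = current - columns;
  if (key === 'ArrowDown') next = current + columns;
  // Stay inside the grid: ignore moves that would fall off the top or bottom.
  if (next < 0 || next >= total) return current;
  // Horizontal moves must stay on the same row, so block wrapping.
  if ((key === 'ArrowLeft' || key === 'ArrowRight') &&
      Math.floor(next / columns) !== Math.floor(current / columns)) {
    return current;
  }
  return next;
}

// In a browser you would wire this up to a keydown listener, for example:
// let focused = 0;
// document.addEventListener('keydown', (event) => {
//   const items = document.querySelectorAll('.item');
//   focused = nextIndex(focused, event.key, 3, items.length);
//   items[focused].focus();
// });
```

Keeping the movement logic in a pure function like this makes it easy to tailor the behaviour (wrapping, skipping, shortcuts) to one specific person’s way of using their keyboard.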
This case nicely illustrates the first Exclusive Design principle which says that we should study situation. The assumptions we had about keyboard usage turned out to be insufficient to create an interface that works for Marijn. We really needed to observe how he uses his computer in order to come up with something that works.
The oversimplified idea of how people use their keyboard was not the only assumption that proved to be false, or half-true.
It is good practice to use semantic HTML on the web. One of the reasons is that semantic meaning makes it easier for blind people to understand the function of elements and the structure of a webpage. I have been teaching this to my students for years, and I’ve always done my best to use proper semantics in the websites I built. One of the more shocking observations during my research was that for certain screen reader users, certain semantics are more confusing and distracting than helpful.

Structural semantics, like heading levels and website navigation, are spoken out loud by screen readers. Instead of helping, this extra information makes things more complicated when a user doesn’t understand what a heading level or a navigation is for. With this insight I created a prototype of a website for Simon Dogger without using any structural semantics. Again, I gained these new insights by closely studying situation, which in this case meant observing how Simon Dogger and Hannes Wallrafen use their computers. More details on this in the chapter about designing like it’s 1999.
One of the first reactions on Twitter to the very first episode of my podcast was “very nice, but I am deaf”.1 Blushing with shame, I published a transcript a week later. All episodes have been published as audio with a transcript ever since.
I assumed that all deaf people who were interested in my topic would be happy now, until I saw a presentation by Marie van Driessche.2 She explained that sign language has its own grammar and its own structural logic, which makes it very hard for people who were born deaf to understand written texts. And when that text is a literal transcript of spoken language, it’s even harder.
This is one of those issues that are hard to solve. The idea behind publishing a literal transcript is that it resembles the original most accurately. Other types of transcripts, where you transform spoken language into proper written language, will always require some sort of interpretation. On the other hand, those types of transcripts could be more accessible to some people. A possible tool for solving these kinds of issues might be the priority of constituencies that I explain in the chapter about stress cases. Another option would be to use an official sign language transcript instead of a written one.
Again, this is an illustration of the fact that assumptions alone are not enough. You need to study situation, in this case how Deaf people may perceive your transcript. It’s also a nice illustration of the third principle, which says that you should prioritise identity. In this case, actively using the identity of people who are Deaf, actively designing with them, might result in better inclusive interfaces for things like podcasts.
There are all kinds of assumptions we have about blind people. I remember the first time I met Léonie Watson, during dinner the day before a conference in Zurich. I told her about a script I had written which translates colour codes into spoken language. For me this was an entertaining conceptual exercise; I didn’t really think of it as a real solution because, as I asked Léonie, “what use is colour to the blind?” To which Léonie answered that there’s no such thing as the blind. There are people who were born completely blind, for whom colour may be a hard concept to grasp. There are people who are partially blind, who may be able to see some colour. There are people like Léonie, who became blind at a later age and still have a memory of colour. To name just a few. So yes, this tool I created may very well be handy to many blind people. It might also be handy for a colourblind art director I used to work with.
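My original script isn’t reproduced here, but the idea behind it can be sketched. The following is a hypothetical, simplified version: it turns a hex colour code into a rough spoken description by looking at the colour’s hue and lightness, roughly as in the HSL colour model.

```javascript
// A sketch of the idea behind a colour-to-language script (hypothetical,
// not my original code): translate a hex colour code into a rough
// description like "dark blue" or "light yellow".
function describeColour(hex) {
  const n = parseInt(hex.replace('#', ''), 16);
  const r = (n >> 16) & 255, g = (n >> 8) & 255, b = n & 255;
  const max = Math.max(r, g, b), min = Math.min(r, g, b);
  const lightness = (max + min) / 510; // 0 = black, 1 = white

  // Greys have no hue at all.
  if (max === min) {
    if (lightness < 0.1) return 'black';
    if (lightness > 0.9) return 'white';
    return 'grey';
  }

  // Hue in degrees, as in the HSL colour model.
  let hue;
  if (max === r) hue = (60 * (g - b) / (max - min) + 360) % 360;
  else if (max === g) hue = 60 * (b - r) / (max - min) + 120;
  else hue = 60 * (r - g) / (max - min) + 240;

  // Map the hue to a small vocabulary of colour names.
  const names = [
    [30, 'red'], [90, 'yellow'], [150, 'green'],
    [210, 'cyan'], [270, 'blue'], [330, 'magenta'], [360, 'red'],
  ];
  const name = names.find(([limit]) => hue < limit)[1];
  const shade = lightness < 0.35 ? 'dark ' : lightness > 0.65 ? 'light ' : '';
  return shade + name;
}
```

For example, `describeColour('#000080')` returns `'dark blue'`. A real tool would of course need a much richer vocabulary, and, as Léonie’s answer suggests, what counts as a useful description depends on who is listening.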
In the past I’ve heard similar assumptions in brainstorm sessions with design teams. “Blind people don’t use our service” is an almost logical assumption when you create a website with videos.3 In these cases, instead of assuming, we should ask the people we assume things about. Again, this illustrates the first and the third principles: studying situation, and prioritising identity.
In the next chapter, called More death to more bullshit, I will talk about the weird assumption I had that all screen reader users are computer experts.
1. Eva Westerhoff. Tweet. 2016. twitter.com/
3. Where’s that example about a blind father buying a car for his kid?