User Testing: The Good Conversation

User insight has long been a mainstay for designers, and today it is obvious to everyone that insight into the user's situation, motivation and needs is a prerequisite for designing products that last.

Audun Støren
Design

How to get user insights?

That is easier said than done. In my experience, few business leaders have conducted or attended an in-depth interview with their end customers, and it is relatively rare to see a Norwegian corporate and development culture built on user testing and user insight.

Towards the end of 2018, Increo (formerly Uredd design agency) began to invest heavily in facilitating rapid design processes that include user testing, known as Google Design Sprints.

The method makes it easy to conduct in-depth interviews with actual users at an early stage of the design process. Designers, product owners and others involved all observe the in-depth interviews. The insights often prove decisive for the further work. In particular, the interviews have a good effect on management's view of what user insight is and what it is worth.

After conducting over 40 user interviews and user tests in 2019, I have gathered some reflections that I would like to share.

Identify needs or optimize?

Briefly, we can divide user tests into two main categories: formative and summative evaluation.

“Formative evaluation” lends itself well to the start of a design project. It helps us find out what works well with the design, and why. Through interviews and task-based testing, we gain an understanding of the user's experience of the unfinished product. We get answers to what problems the product should solve and whether the user's needs are being met.

“Summative evaluation” is suitable for optimizing ready-made solutions, and is probably what people most often think of when we talk about user testing. This form of testing consists of analyzing larger amounts of data, such as visitor numbers on a web page or a questionnaire. The test results give us the basis to optimize the product and improve the overall user experience.

In the context of design sprints, we are concerned with formative evaluation, preferably in the form of in-depth interviews.

Five interviews are enough

What kind of insight are we really looking for in these in-depth interviews? I have a background in advertising and have often seen target group descriptions used to describe the user's needs and behavior. But such descriptions are only a cross-section of a group, not an understanding of the individual. They say little about what people actually think and what motivates them when they use the product or see the visual expression we have designed.

For example, let's say you sit with the statistics from your website in front of you. You see that people are on the page for two and a half minutes on average — one minute longer than before a design change was made. Success! you might be thinking.

But what do you really know about the reason for that extra minute?

Could it be that the user does not find what they are looking for and that the design change has only added frustration?

We need to ask the user directly. And we don't have to talk to very many, either. Often, five people are enough to uncover flaws and shortcomings. And if we are extra smart, we talk to the end users before a single line of code is written.

It starts with pretending

We want to avoid working for a long time on ideas and designs that we do not know will work. Therefore, the goal is to create something that can be tested in as short a time as possible. Google Design Sprint is a well-suited tool for doing just that. In 3-4 days, a multidisciplinary team can work an idea through to a prototype ready for testing.

Screenshot of an example prototype
Screenshot from a prototype designed at Increo (formerly Uredd). You should not spend more than one day creating the prototype, so it is advisable to use templates and "steal" elements from other sites.

My experience is that the key to a successful prototype is that it looks as real as possible. A user test where the user must imagine how the finished product will look can make for a challenging interview situation. Then one has to spend time explaining the premises, and what the sketch is supposed to represent, rather than spending the time gaining valuable insights into the user's wishes and priorities.

The simplest trick to make a prototype look real is to steal elements from other products or make use of templates. Material Design and Apple Design Resources are great starting points for creating prototypes that look finished.

Other useful resources when creating prototypes are the image database Unsplash, free icons from Flaticon and the tool Figma for design and prototyping.

The hard part of the job

Surprisingly, obtaining test subjects for a user test is quite challenging. If you work on behalf of a customer, you may want to let the customer have primary responsibility for obtaining test subjects. The customer usually knows their own users best, and often knows how to get in touch with relevant test subjects.

It can be nice to give the test subjects a reward for participating. We usually give each participant a gift card worth NOK 500. That also makes it easier to ask people whether they want to participate as a test subject.

Scheduling the testers' attendance is something you may want to handle yourself. This makes it easier to set up a schedule that works well for everyone involved, especially the person who will conduct the interviews. It can be demanding to conduct several in-depth interviews back to back, so it is important to have enough time for breaks between sessions.

It's also nice to have some time at the start of the day to prepare for the interviews before the first test person. As a rule, the schedule for the day looks like this:

  • 09:30 — 10:15 User test 1
  • 10:30 — 11:15 User test 2
  • 12:00 — 12:45 User test 3
  • 13:00 — 13:45 User test 4
  • 14:00 — 14:45 User test 5
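For those who want to plan a test day programmatically, the schedule above can be generated from a few parameters. This is a small sketch, not anything from the article; the 15-minute breaks and the 30-minute lunch extension before session three are assumptions inferred from the times in the list.

```python
from datetime import datetime, timedelta

def interview_schedule(start="09:30", sessions=5, length=45,
                       gap=15, lunch_before=3, lunch=30):
    """Build a day plan of user tests: fixed-length sessions with breaks.

    All parameter names and defaults are illustrative assumptions:
    45-minute sessions, 15-minute gaps, and an extra 30-minute lunch
    pause before the session numbered `lunch_before`.
    """
    t = datetime.strptime(start, "%H:%M")
    plan = []
    for i in range(1, sessions + 1):
        if i == lunch_before:
            t += timedelta(minutes=lunch)  # stretch the break into a lunch pause
        end = t + timedelta(minutes=length)
        plan.append(f"{t:%H:%M} - {end:%H:%M} User test {i}")
        t = end + timedelta(minutes=gap)
    return plan

for slot in interview_schedule():
    print(slot)
```

With the defaults, this reproduces the five slots listed above, including the longer break before user test 3.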


It is advantageous to conduct all five interviews on the same day. Fast progress is one of the success criteria of this way of working. If interviews are spread over several days, it is easy to lose momentum, and coordinating who should observe the interviews also becomes more demanding.

Easy User Test Lab Setup

A user test lab must be set up so that the test person feels comfortable and confident in the situation, without a strong sense of being observed. At the same time, test subjects should be aware that everything they say is documented.

An optimal user test lab consists of two rooms. One small, pleasant meeting room for the interviews, and one larger room for the observers. In the small room there is a table, a couple of chairs, a PC and coffee and water. If the test is to be done on a smartphone, we set up a camera above the phone so that the observers can see what is happening on the screen.

The observer room must have a screen for live video viewing and large whiteboards to hang up notes along the way. There should be room for at least four people in the observer's room.

Picture of a room with a whiteboard prepared for a workshop
We use the process room as an observer space. Here, there is plenty of space for the observers to move around the room and hang up notes on the walls while the interviews are ongoing. Photo: Lindbak.

We use Google Meet or Whereby for streaming and recording the interviews. If the prototype is to be tested on desktop, we use the camera on the PC to film the test person — while sharing the screen. This is sent directly to a PC in the observer room which utilizes the built-in recording feature to document the interview.

When testing on a smartphone, we use the camera on the PC to film the test subject. In addition, we set up an external webcam that films the screen of the phone. This setup assumes that the test subject leaves the phone at rest on the table.

Note: there are many good ways to set up a user lab, but this setup is easy, cheap and works well!

Make the test person confident in the role

Now everything should be ready for user testing. We have a prototype that looks like a real product, a schedule of five test people, and we have premises and equipment suitable for conducting the user tests.

Before the first interview, we draw up a matrix on the whiteboard in the observer room, with the names of the test subjects horizontally and the names of the prototype's screens vertically. In addition, we usually include a row called General. Under General go statements from the users that are useful but not directly related to the prototype.

Miro is also recommended for the matrix. Then all notes are digital right away, which lets observers participate from different locations.
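If you prefer to capture the matrix digitally rather than on a whiteboard, its structure is simply screens by test subjects, with tagged statements in each cell. This is a minimal sketch of that data structure; the screen names, subject names and statements are made up for illustration.

```python
from collections import defaultdict

# Feedback matrix: rows are prototype screens (plus a "General" row),
# columns are test subjects; each cell holds a list of tagged statements,
# mirroring green (positive/neutral) and red (negative) Post-it notes.
matrix = defaultdict(lambda: defaultdict(list))

def note(screen, subject, statement, negative=False):
    """File one observer note in the right cell of the matrix."""
    matrix[screen][subject].append({"text": statement, "negative": negative})

# Illustrative entries, not real test data:
note("Front page", "Subject 1", "Immediately understood the 'new task' button")
note("Front page", "Subject 2", "Did not like the professional terms", negative=True)
note("General", "Subject 2", "Uses similar online tools daily")
```

A plain nested dictionary like this is easy to export to a spreadsheet afterwards, which suits the summarization step described later in the article.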

As an interviewer, it is your job to create a safe setting and to ensure that the test subject gets to say everything on his or her mind. It's about building a good relationship, and the job starts the moment the person walks in the door of the user lab.

Show that you are confident in your role and give your test subject full attention.

Before embarking on the actual interview, I show the test person the observer room. That way I am sure the test person understands the premise of the user test and that other people are going to listen to what is being said. It would be unfortunate if a test subject did not understand that she was being observed during the interview, and then discovered it afterwards.

It is the test person who tests us

After the introduction, I bring the test person into the small room. Here she is placed in front of the PC and given information about the cameras and footage (which is only for documentation for internal use). Then I quickly explain what the project is about, who designed the idea and the prototype and that the interview is not a test. On the contrary, it is the test person who tests us.

Although I have often been involved in making the prototype, I usually say that I have nothing to do with the idea or the prototype, and that I am solely responsible for conducting the user tests. Therefore, the test person need not be afraid that I will take negative feedback about the prototype personally.

In addition to trying to defuse the interviewee's natural desire to be nice, and thus withhold negative feedback, I want to create a bond that puts the test subject and me on the same team, "against the design group." I find that this makes it easier for the test person to give honest feedback.

The last step before the actual user test is to explain the form of the test. I use an open, not task-based, form of in-depth interview. The purpose of the interview is primarily to check whether the basic idea, and the premises on which it is built, are relevant, not whether the interface looks pretty. That's why I ask the test person to speak freely about one screen at a time. The most important thing is that the test person thinks out loud and says everything that occurs to him or her. No information is immaterial or unnecessary. And I can promise you: it almost always turns up gold!

The Good Conversation

The user test itself opens with a couple of general questions about the problem that the prototype addresses. “What are you working on? Do you use online tools in your job?”

It's nice for the test person to start by talking about things they know well, even if it's information we already have. These questions are also a nice warm-up for me as an interviewer. We can talk freely together without anything getting in the way.

Then we shift the focus onto the screen. As a rule, the question that follows is:

What do you see here?

The conversation often starts off a bit technical:

Here I see a logo on the left. Here is a button that says “new task” on it. Here is a list with some points...

Does the wording in the list make sense to you?

Yes, these were familiar words, but I don't like that you always have to use professional terms...

What don't you like about professional terms?

Then the conversation is flowing. In about 45 minutes we go through all the screens of the prototype. If the test subject asks what something means, or what a button does, it is important to answer with: "What should it mean?" or "What would you expect to happen if you clicked the button?".

Screenshot of a video meeting with one camera pointed at the test person and one camera pointed down over a mobile phone to capture all angles and movements

As the interview draws to a close, it is advisable to set aside three minutes to ask whether there is anything on the test subject's mind that there has not been room to say. "Is there anything about the prototype that you particularly liked? Something you missed?"

After the interview, we usually bring the test person in to the observers so that they can thank him or her for the effort and possibly ask questions directly. Sometimes the observers have one or two things they wish the test person would elaborate on a little more.

Observers note along the way

During the entire interview, two to six people sit in the observer room and watch the interview on screen. This is usually the project facilitator, designers, the product owner and any others associated with the project. Their job is to note down the test subject's important statements: positive and neutral statements on green Post-it notes and negative statements on red ones. The notes are placed in the appropriate cell of the matrix on the whiteboard.


After five interviews, it often becomes clear what works and what doesn't work in the prototype. In addition, we are happy to have a clearer understanding of what we are really trying to create. The focus has shifted from designing something we think will hit a strong target audience, to solving the problems that test subjects have pointed out during interviews. The experience of having come miles ahead in the design process is almost always jubilantly present in the room.

Whiteboard with many Post-it notes in pink and green
The observers' notes are hung on the wall in a matrix. Red notes are negative feedback, green notes are positive or neutral. Photo: Increo.

The way forward after user testing

Finally, the contents of the Post-it notes must be written down in a suitable tool. We usually use Google Sheets or Miro (if the notes are not already captured there). That makes the document easy to share with others afterwards. Based on the feedback, we write a summary for each screen in the prototype, as well as a general recommendation for further work.
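Once the notes are in a spreadsheet, a first rough summary is just a tally of red versus green notes per screen. This is a sketch under assumed data: the tuple format and the example notes are stand-ins for the exported rows, not the article's actual format.

```python
def summarize(notes):
    """Per-screen tally of green (positive/neutral) vs red (negative) notes.

    `notes` is a list of (screen, statement, negative) tuples; an assumed
    stand-in for rows exported from the spreadsheet.
    """
    summary = {}
    for screen, _statement, negative in notes:
        counts = summary.setdefault(screen, {"green": 0, "red": 0})
        counts["red" if negative else "green"] += 1
    return summary

# Illustrative notes, not real test data:
notes = [
    ("Front page", "Clear call to action", False),
    ("Front page", "Jargon confused me", True),
    ("Settings", "Found nothing I wanted to change", True),
]
print(summarize(notes))
```

The tally only points at where the problems cluster; the written summary per screen still has to interpret what the statements mean.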

What challenges were highlighted? Are there any obvious ways to solve them? Was there functionality we hadn't thought of? Were parts of the prototype unnecessary? Were our hypotheses about the problems and needs of users correct?

And last but not least, is the idea worth moving forward with?
