There is nothing more powerful than sitting a user down in front of an interface and asking them to try to achieve a task. This is why usability testing (usually conducted with five to ten users) is an essential technique for user-centred design.

If you’ve ever conducted usability tests you’ll know that the easy part is watching the users. They practically tell you what needs to change in your interface as they ‘think aloud’ while completing tasks. The hard part is remembering all the little gems you observe during the test sessions! Once all the sessions are over, you need to work out which issues are the most important (or affect the most users) and then prioritise those worth focusing attention on. This can be a challenge when you’ve just sat through a full day of testing.

To make the most out of our usability test sessions we use a method we like to call ‘the rapid debrief’.

The traditional approach

To understand why the rapid debrief works well, let’s first look at what the conventional method involves. Usually there is a clear distinction between the testing, debriefing and analysis phases of usability testing.

[Figure: DiUS Usability Testing: Traditional vs Rapid Debrief]

The test sessions are often scheduled back-to-back and it’s only once all test sessions are completed that the facilitator, note taker and observers will gather together for a debriefing session. Observers will then offer up the usability issues they noticed and the facilitator creates a list which is then prioritised.

Some drawbacks of this approach are:

  • some observers may not have noticed an issue, or may believe it didn’t occur;
  • there may be disagreement about how regularly an issue occurred;
  • there’s a greater need to review recordings because of the gap between the test and the debrief;
  • if the sessions weren’t recorded (or the recording didn’t work), details may have been forgotten in the time-lag between viewing the session and the debrief; and
  • observers can get fatigued (and bored!) after multiple test sessions.

Using the Rapid Debrief

In the rapid debrief technique, short breaks (debriefs) are scheduled between tests. During each debrief the facilitator, note taker and observers gather and discuss the issues they noticed in the test they’ve just observed. It is faster and easier to discuss the findings immediately after the test, while everything is fresh in everyone’s mind, so less time is spent trying to recall issues and review footage. Sometimes, while discussing an issue, you’ll land on a possible solution and can draw up a quick wireframe there and then.

Where possible, we like to use a whiteboard or a big sheet of paper. It probably goes without saying that if the whiteboard is in the same room where the tests are being conducted, make sure to cover it up before bringing in the next user.

As testing continues the list grows, and some issues pop up again and again. Anything that has been seen before gets a tally mark added to it. This way, by the end of testing we have a list of issues and a count of how frequently each one occurred.

The advantages of this approach are that it:

  • creates an in-depth list of issues that has been discussed and vetted as testing progresses;
  • gives a tally of the prevalence of issues;
  • is quicker than the traditional method;
  • quickly brings observers up to speed even if they haven’t seen previous tests;
  • improves stakeholder buy-in (when stakeholders have been observing); and
  • fights boredom by giving observers an ever-expanding list of things to keep a lookout for, as well as the challenge of spotting new things to add to the list.

We’ve had a lot of positive feedback from our clients on the rapid debrief approach. Looking back at the usability testing on one of our recent projects, Emma Buckland, Lead User Experience Designer in the ME Bank Digital Team, said the rapid debrief approach “Keeps it rich and interactive – both in and out of the session. A very fast and memorable way of understanding and collating the findings”.

Next time you do usability testing, try out the rapid debrief and see if you agree.