
Qualitative Website Usability Testing


A couple of months back, we posted to su_webmasters asking for advice, suggestions, and ideas for conducting web usability tests. We're grateful for all the suggestions we received -- thank you!

We wanted to follow up with a quick recap of what we ended up doing, and how it went.

Goals
Conduct qualitative usability testing on our ECorner website (ecorner.stanford.edu) to gain insights on the general usability of the UI.

Establish a method to do this type of testing frequently for all our sites. In order to keep it simple and easy (in the hope that this wouldn't be the only time we do this), we knew we'd have to try to:

  1. Use a nearby location and rooms we have ready access to
  2. Keep costs low -- use equipment we already own if possible, find inexpensive software options
  3. Keep the process simple -- keep the setup simple and minimal, limit the time commitment needed from the team

Equipment / Software
In our setup, we used:

  • Silverback (thanks to Vijoy for the suggestion) for capture -- it records the screen and also uses the Mac's iSight camera to capture the participant's face, then exports a nice picture-in-picture version showing both the screen and the participant's face. I've heard Morae is a similar app for Windows.
  • An Apple remote (necessary to make chapter and highlight marks in Silverback - without it, the session can still be recorded but marks can't be made)

Locations
The test room and observation room were not adjacent to one another, and didn't need to be since we didn't need to run any cables. An added benefit of non-adjacent rooms was that the observers didn't have to worry about being overheard by the participant. We used two conference rooms in our building near our offices:

Test room setup:

  • iMac with paired remote
  • Two chairs + table, minimal distractions
  • Skype (account 1) set to screen sharing mode; Skype window hidden
  • ECorner website open to home page
  • Silverback set up for the next participant

Observation room setup:

  • MacBook Pro connected to projector and a small speaker, so that others could see and hear what was coming through on Skype
  • Table, chairs, and power outlets for observers
  • Skype (account 2), connected to account 1, set to mute + no video (to prevent any distractions on the participant's side)

Process
We used Steve Krug's Rocket Surgery Made Easy as our guide, along with modified versions of his sample documents for the test session script, observer guidelines, and participant recording release. [http://www.sensible.com/rocketsurgery/]

An afternoon (1 - 5pm) was scheduled and the conference rooms reserved. Each session would be 45 - 50 minutes long, with an hour at the end for observers to debrief.

Three participants were recruited. We found them via a student group that agreed to post a notice requesting testers, as well as by identifying willing students in our building. We specified up-front that they should have no prior experience with the site we were testing, and that a small honorarium would be offered to show our appreciation for their time and attention. One participant did cancel on the morning of the tests, but we were able to find a replacement quickly.

Observers were seated in the observation room prior to the arrival of the first participant. Upon arrival, the participant was greeted and led to the testing room. Using a script, the facilitator led the participant through steps to explore the ECorner website, asking them to provide verbal feedback at every stage. The participant was also asked to complete specific tasks on ECorner. Finally, a similar website (AcademicEarth.org) was shown to the participant in order to compare its features to those of ECorner. At the end of the session, the participant was thanked, given their honorarium, and escorted out.

Observers were able to see the participant's screen and hear audio of the session (via Skype -- Silverback simply captures in the background, and doesn't provide any live observation functionality). All observers took part in the post-session debrief meeting.

Results
While some of the UI issues our participants encountered were things we were already aware of, it was particularly interesting to see which ones kept coming up. After watching three sessions, it was clear to the team which three issues were the most critical and should be fixed first. For those interested, on ecorner.stanford.edu these were:

  • The video browser section on the home page is difficult to use, and also prevents the user from exploring the categories in the main navigation.
  • The audio player looks too much like a video player and confuses users viewing any podcast page.
  • The listing of full-length videos, clips, and podcasts at the right of the player on any video page is confusing and doesn’t clearly convey the relationship of these materials to one another.

Lessons learned

  • It's good to have a backup participant in case someone cancels or doesn't show.
  • It's a significant challenge to avoid leading the participant too much during the session. We hope to improve in this area -- ask more open-ended questions, give the participant more time to explore without telling them what to do next, etc.
  • If you're tempted to run more than three sessions, be advised that even with the three we ran, our facilitator was getting tired by the end and the observers were starting to get distracted.
  • The export process from Silverback takes time -- best to set it up overnight if you're exporting whole sessions. The resulting QuickTime file is nice to have for review, though.

We feel we now have an easy way to conduct these tests, and are planning our next session for early October. We hope to conduct this type of testing for all our other websites, as well.