A couple of months back we'd posted to su_webmasters asking for advice / suggestions / ideas for conducting web usability tests. We're grateful for all the suggestions we received -- thank you for those!
We wanted to follow up with a quick recap of what we ended up doing, and how it went.
Our goals were to:
- Conduct qualitative usability testing on our ECorner website (ecorner.stanford.edu) to gain insights into the general usability of the UI.
- Establish a method for doing this type of testing frequently for all our sites. To keep it simple and easy (in the hope that this wouldn't be the only time we do it), we knew we'd have to try to:
Equipment / Software
The key elements in our setup were:
- Silverback, running on the test machine to record the session in the background
- Skype, to share the participant's screen and audio live with the observation room
In addition, we used:
The test room and observation room were not adjacent to one another, and didn't have to be, since no cables needed to be run between them. An added benefit of non-adjacent rooms was that the observers didn't have to worry about being overheard by the participant. We used two conference rooms in our building near our offices:
Test room setup:
Observation room setup:
We used Steve Krug's Rocket Surgery Made Easy as our guide, adapting his sample documents for the test session script, observer guidelines, and participant recording release. [http://www.sensible.com/rocketsurgery/]
An afternoon (1 - 5pm) was scheduled and the conference rooms reserved. Each session would be 45 - 50 minutes long, with an hour at the end for observers to debrief.
Three participants were recruited. We found them via a student group that agreed to post a notice requesting testers, as well as by identifying willing students in our building. We specified up front that they should have no prior experience with the site we were testing, and that a small honorarium would be offered to show our appreciation for their time and attention. One participant did cancel on the morning of the tests, but we were able to find a replacement quickly.
Observers were seated in the observation room prior to the arrival of the first participant. Upon arrival, each participant was greeted and led to the testing room. Using a script, the facilitator led the participant through steps to explore the ECorner website, asking them to provide verbal feedback at every stage. The participant was also asked to complete specific tasks on ECorner. Finally, a similar website (AcademicEarth.org) was shown to the participant in order to compare its features to those of ECorner. At the end of the session the participant was thanked, given their honorarium, and escorted out.
Observers were able to see the participant's screen and hear audio of the session (via Skype -- Silverback simply captures in the background, and doesn't provide any live observation functionality). All observers took part in the post-session debrief meeting.
While some of the UI issues our participants encountered were things we were already aware of, it was particularly interesting to see which ones kept coming up. After watching three sessions, it was clear to the team which three issues were the most critical and should be fixed first. For those interested, on ecorner.stanford.edu these were:
We feel we now have an easy way to conduct these tests, and are planning our next session for early October. We hope to run this type of testing on all our other websites as well.