Usability Testers Manual

A guide for usability testing on a budget

Usability is a very important aspect in the development of interactive Mediamatic products. Unfortunately, there isn't a big budget available to do top-notch usability testing. To help you along with getting set up, this short manual was written.

Recruitment

To test your designs, you'll need people. Here are some different methods for finding participants.

  • Post an advert on the Mediamatic website. There are always users on the website who are interested. Additionally, have the advert shared on the official Facebook page and Twitter channel. That way you spread your chances across the Mediamatic social network.
  • Ask people you know: friends, colleagues, acquaintances. They're a resource sitting right under your nose, waiting to be used.
  • Ask visitors to exhibitions at Mediamatic Bank. They might have some time to do a small user test.

Once you have your users, it's time to move on to preparing the testing environment.

Setting up the testing environment

You need very few resources to set up a good testing environment. Make sure to reserve one of the meeting rooms so you can test in a quiet environment without a lot of distractions. Furthermore, here's a list of things you'll need.

  • A MacBook with QuickTime Pro, or other software that records the screen as well as audio
  • Skype, for screen sharing with any colleagues observing from another room
  • A USB mouse, for participants who don't like using the trackpad
  • Separate paper slips for each scenario
  • A notepad for taking notes during user tests
  • Something to drink (and to eat)

When starting up the MacBook, make sure QuickTime is already running in the background so you don't have to open it before you begin testing. Make sure that any files needed during testing are conveniently located in a folder on the desktop, so users aren't distracted too much from their task.

If a colleague is available, it is recommended to have them observe the session via screensharing on Skype. That way anything that slips past you while making notes can still be observed by your colleague, increasing the chance that you'll find all issues.

Start up Skype and establish the screen-sharing connection. Do this before the participant arrives.

Also, check that everything is there and functional. A checklist is included in the offline version of this manual; you can find it on the file server under 3.)Anymeta/Interaction design/Common aanpassingen 2011. If you plan on using a different test setup, make a checklist of your own that matches it!

Testing

When your participant arrives and you've introduced yourself, have them take a seat behind the computer. Explain what you are going to test in this session and tell them that any errors that occur aren't their fault, but the system's.

When you give the participant a scenario, read out what it says to clarify what you would like them to do. Don't use words that point them towards the result; just clarify the task.

Take your time to make notes. Ask questions if you see something interesting happen. Make your participants think aloud. You can ask them the following questions to get the most information:

  • What did you expect to happen when you did that?
  • What do you think will happen when you do that?
  • What are you looking for?
  • What are you trying to do?

If you don't get the time to ask everything during the session, ask afterwards. This way you get as much explanation for their actions as possible.

After testing, thank them and give them a small present for their efforts. After they have left, transfer the screen recording to the file server and discuss the observations with your fellow observer(s).

Results

The best way to digitize your results is to put them in a spreadsheet; divide the results by person and task for each scenario. If you have time, watch the recordings again; you might have missed something that is difficult to spot.
After you have made a list of every problem each user experienced, count for each problem how many users experienced it. That way you can prioritize which issues should be solved right away.
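If you prefer to automate the counting step instead of doing it by hand in the spreadsheet, it can be sketched in a few lines of Python. The data format and problem descriptions below are purely illustrative, not part of this manual:

```python
from collections import Counter

# Hypothetical observations transcribed from your notes and recordings,
# as (participant, problem description) pairs.
observations = [
    ("P1", "missed the search field"),
    ("P2", "missed the search field"),
    ("P3", "missed the search field"),
    ("P1", "confused by login wording"),
    ("P2", "confused by login wording"),
    ("P3", "gave up on upload task"),
]

# Count, for each problem, how many distinct participants experienced it.
users_per_problem = Counter(problem for _, problem in set(observations))

# Sort by frequency so the most widespread issues come first.
priorities = users_per_problem.most_common()
for problem, count in priorities:
    print(f"{count} user(s): {problem}")
```

The problems at the top of the resulting list are the ones to solve right away.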

Coming up with solutions

Coming up with solutions can be the hardest part of the usability cycle. It requires a lot of thought about how people might respond to what you have come up with. Don't hesitate to make rough sketches or clickable prototypes to quickly test with other people. And don't forget: there are multiple approaches to each problem. Make sure to cover as many of these angles as you can, so you can quickly determine which way to go during the design meeting.

Comments (2)

UX methods

Here you will find a (somewhat related) list of methods to evaluate user experience: http://www.allaboutux.org/

29 Jun 2011, 16:32

An interesting read

A well written and concise article about UX testing. I will probably refer to this article in the near future.

25 Jan 2012, 16:37