- usability testing will resolve arguments and reveal that what a team was arguing about was not that important
- awareness of usability is key
- focus groups are not usability tests
- Krug believes that usability tests are better than focus groups
- focus groups: a small group of people sit around a table and talk about things -- products they like or don't like, past experiences, or reactions
- usability tests: one person uses something (a site, product, etc.) and goes through typical tasks so you can figure out what will frustrate or confuse people
- usability tests are different because you watch someone actually use something
- focus groups are nice for exploring abstract things, but not for specifics like how your site works and how to improve it
- focus groups are better in the planning stages while usability tests should be used throughout the whole process
- doing regular usability testing is key to having a great site
- reminds you that not everyone thinks like you do or knows what you do
- it gives you a fresh perspective on things
- helps you realize that you take things for granted that might not be obvious to someone else
- testing at least one user is better than none
- even the worst test will teach you something
- they're fairly easy to do and will improve your site somehow
- testing one user in the beginning of the process is better than testing a lot of people at the end
- testing early and often helps you get the most out of it
- it's harder to make changes to a site once it's in use
- correcting mistakes early will help you later
- usability testing does not have to be elaborate or expensive
- Krug explains how to DIY...
- how often should you test?
- Krug advises one morning a month, because...
- it keeps it simple, you get the information you need to move forward, you don't have to figure out when to test, and it makes it easier for team members to attend
- how many users do you need?
- Krug advises 3, and this is enough, because...
- you aren't trying to prove anything, so you don't need quantitative testing -- usability tests are qualitative
- you want to improve something
- you don't need to find every problem, especially since you'll be doing more testing next month
- how do you choose participants?
- Krug thinks you should recruit loosely and grade on a curve, which means...
- finding testers within your target audience isn't super important -- this requires more work and probably more money
- don't get hung up on finding users that reflect your audience
- if needed, recruit some people with specific knowledge
- it's fine to recruit people who aren't in your target audience because:
- everyone should be able to use it, not just your target audience
- we're all beginners on the inside
- everyone appreciates clarity
- how do you find your participants?
- you can recruit through user groups, Craigslist, Facebook, Twitter, asking friends, etc.
- incentives for "average" users range from $50-100
- where should you test?
- a quiet space with no interruptions, a computer with Internet access, a mouse, a keyboard, and a microphone
- use screen sharing software along with screen recording software
- who should do the testing?
- anyone can facilitate; it just takes practice
- encourage the participants to think out loud and be honest
- who should observe?
- as many people as possible, according to Krug
- it's usually a transformative experience for everyone observing
- try to get everyone involved: team members, stakeholders, managers, etc.
- you'll need an observation room
- observers should write down what they see and bring their thoughts to the debriefing
- what do you test? when?
- as early as possible -- there is no such thing as too early
- before you begin designing your site, you should do a test of competitive sites
- you'll learn things before even building anything
- how do you choose what tasks to test?
- this will depend on what you have available
- example tasks could be: creating an account, logging in using an existing name and password, retrieving a forgotten password, retrieving a forgotten username, changing the answer to a security question, etc.
- tasks should be worded carefully and should include any information they'll need
- you should try to let participants choose some aspects of the task themselves -- it gives them a more personal investment in it
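The guidance above (careful wording, include any info participants will need, let them choose some aspects) can be sketched as a simple task list. The structure and field names here are my illustration, not from the book:

```python
# Keep each task's exact wording together with any info the
# participant needs to complete it. (Structure and field names are
# illustrative; the email address is a hypothetical test account.)
tasks = [
    {
        "wording": "You forgot your password. Get the site to let you back in.",
        "given_info": {"email": "pat@example.com"},  # hypothetical account
    },
    {
        # letting participants pick the item gives them personal investment
        "wording": "Find a product you'd actually want to buy, and add it to the cart.",
        "given_info": {},
    },
]
```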
- what happens during a test?
- Krug makes a test script available (with carefully chosen wording)
- should always include:
- a welcome: explain what's about to happen
- the questions: ask the participant about themselves
- the home page tour: ask the participant to look around and tell you what they think
- the tasks: watch the participant perform tasks -- keep them focused and honest, and let them work on their own
- probing: ask them questions about anything that happened during the test
- wrapping up: thank them for their help, pay them, and show them out
- Krug features a helpful annotated test session that he performed on pages 127-136
- he uses the script mentioned above
- typical problems you'll face
- users are unclear on the concept
- what they're looking for isn't there -- perhaps you didn't use the words they would typically look for
- there's too much going on -- you may need to reduce the noise on the page
- debriefing and how to decide what to fix
- debrief ASAP so ideas are still fresh in everyone's minds
- fix the most serious problems first -- this means:
- making a collective list -- ask everyone what they thought the 3 most serious problems were
- choose the ten most serious problems
- rate those problems
- create an ordered list
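As a rough sketch, the debrief steps above (collect everyone's top three, pick the worst ten, rate, order) could be mechanized like this; the function and its scoring scheme are my illustration, not Krug's:

```python
from collections import defaultdict

def prioritize(observations, top_n=10):
    """Aggregate each observer's 'most serious problems' into one
    ordered fix list: collect, score, and sort.

    observations: (problem, severity) pairs from all observers,
    with severity rated 1 (minor) to 3 (serious).
    Returns up to top_n (problem, total_score) pairs, worst first.
    """
    scores = defaultdict(int)
    for problem, severity in observations:
        scores[problem] += severity  # repeated mentions raise the score
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:top_n]
```

For example, if two observers flag the login link (severities 3 and 2) and one flags home-page jargon (severity 2), `prioritize` puts the login problem first with a combined score of 5.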
- what to fix and what to not fix
- keep a separate list of problems that are not serious and are easy to fix
- resist the impulse to add things like explanations or instructions -- often the solution is to take something away
- take "new feature" requests with a grain of salt -- be suspicious of them
- participants are not designers
- ignore "kayak" problems
- i.e., moments when users briefly get confused but recover almost instantly -- their second guess is good enough
- remote testing: users complete the test from their home or office, making it easier to recruit more types of people
- unmoderated remote testing: e.g. usertesting.com, which lets people record themselves performing a usability test -- this requires almost no effort on your part
- there's essentially no reason not to try unmoderated remote testing (p. 141)