Usability, Customer Experience & Statistics

Can users self-report usability problems?

Jeff Sauro • October 6, 2010

Usability doesn't have to be expensive, time-consuming, or involve lots of users.

Jakob Nielsen popularized this discount approach two decades ago: a focus on finding and fixing problems by testing early and often with small samples generates major insights.

More recently, Steve Krug has taken this informal approach to the masses by encouraging website owners to spend a few minutes a month watching users (or even a neighbor) attempt tasks.

Nielsen and Krug make a convincing case that a little effort put into usability can yield major benefits. In fact, they have made such a compelling case that you might wonder whether we can eliminate the UX middleman altogether. Can users just report the problems themselves?

Have the users do the testing

Having users self-report problems is different from having users complete usability tests remotely. In a typical unmoderated remote test using a popular online service, users are asked to attempt tasks and provide feedback.

As efficient as remote unmoderated tests are, someone still needs to review the recorded sessions, sift through user comments, document the usability problems, and come up with solutions. The process is still time-consuming.

Having users self-report usability problems involves asking users to complete the same tasks as they would in a lab-based or remote unmoderated test. The difference is that the users themselves document the problems, note where they encountered them, and assign severity ratings. Problem-recording mechanisms can be as low-tech as a Word document or as high-tech as a real-time web-based reporting system.

Users Report Half the Problems Experts Do

A few researchers have examined the effectiveness of self-reported usability problems. It turns out users do reasonably well. On average, users report about half the problems trained usability professionals find while watching users in a lab (Castillo et al. 1998; Andreasen et al. 2007; Bruun et al. 2009). This applies to both critical and severe problems. The method appears to work well on both software (e.g., Mozilla Thunderbird) and websites.

Users Find Problems Experts Don't

Not only are users reasonably good at finding the same problems as seasoned professionals, they also find problems experts don't. This is especially likely for learnability issues, where users tend to have a different mental model of the system than experts.

Users Don't Seem to Mind

In Castillo et al.'s analysis, users didn't seem to mind spending time describing and reporting the problems. Even users who gave the longest and most detailed reports said they felt that reporting problems didn't interfere much with performing the tasks. They also didn't feel the reports needed to be anonymous.

Some Tips on Self-Reporting Problems

Here are some tips on having users self-report problems.
  1. Your website or software needs to be functioning reasonably well and accessible remotely.
  2. Allow users to report anonymously (even though most probably won't mind being identified).
  3. Provide some guidelines or training on how to report the problems. A short document, presentation, or video would suffice.
  4. Compensate your users as you would for any remote usability test.
  5. Have users report some combination of the following:
  • Description of the usability problem
  • Severity of the problem (1 = Trivial to 5 = Critical)
  • Where the user encountered the problem (URL or screen)
  • What the user was doing (which task)
  • Expectation about what should have happened
  • Could the user recover? If so, how?
  • Possible design/programming solutions to the problem
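However you collect these fields, it helps to store each report in a consistent structure so they can be sorted and compared later. Here's one minimal sketch in Python; the field names and validation are illustrative, not a format prescribed in the article:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProblemReport:
    """One self-reported usability problem (fields mirror the checklist above)."""
    description: str                  # what went wrong, in the user's own words
    severity: int                     # 1 = trivial ... 5 = critical
    location: str                     # URL or screen where it happened
    task: str                         # which task the user was attempting
    expectation: str = ""             # what the user thought should happen
    recovered: Optional[bool] = None  # could the user recover?
    recovery_notes: str = ""          # if so, how
    suggested_fix: str = ""           # optional design/programming suggestion

    def __post_init__(self):
        # Keep severity on the 1-5 scale described above.
        if not 1 <= self.severity <= 5:
            raise ValueError("severity must be between 1 and 5")

# Example report a user might submit:
report = ProblemReport(
    description="Search button did nothing on first click",
    severity=4,
    location="/search",
    task="Find a product by name",
    recovered=True,
    recovery_notes="Clicked the button a second time",
)
```

A spreadsheet row or web-form submission with the same columns works just as well; the point is that every report arrives with the same fields filled in.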

Self-Reporting is no Panacea

Having users report usability problems can be another effective tool for gathering low-cost feedback. For example, heuristic evaluations work best when done with multiple evaluators, which can be hard if you're the only UX person on a product (or in a company). Having users self-report problems in addition to a heuristic evaluation can be an effective strategy.
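One way to combine the two sources is a simple triage pass: after matching duplicate findings, rank each problem by how many times it was reported and by its worst severity rating. A minimal sketch in Python (the problem IDs, sources, and data are hypothetical):

```python
from collections import defaultdict

# Each finding: (problem_id, source, severity 1-5). The problem_id is
# assigned during a manual dedup pass; sources here are self-reports
# and a heuristic evaluation.
findings = [
    ("search-btn", "self-report", 4),
    ("search-btn", "heuristic", 3),
    ("tiny-font", "heuristic", 2),
    ("checkout-error", "self-report", 5),
]

# Group severity ratings by problem.
grouped = defaultdict(list)
for problem_id, source, severity in findings:
    grouped[problem_id].append(severity)

# Rank by report count (descending), breaking ties by worst severity.
ranked = sorted(grouped.items(), key=lambda kv: (-len(kv[1]), -max(kv[1])))
for problem_id, severities in ranked:
    print(problem_id, len(severities), max(severities))
```

Problems both users and the evaluator flagged float to the top of the fix list, which is exactly the corroboration a lone UX practitioner is missing.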

Self-reporting isn't likely to replace the need for usability professionals anytime soon. The approach can't replace traditional user testing or heuristic evaluations: users still miss about half the problems, especially in more complex systems (although there are usually more problems than development teams can get to anyway).

So if you were looking for an excuse to get rid of us pesky UX professionals, sorry, it appears our jobs are safe for now!

More Reading

  • Castillo, J. C., Hartson, H. R., & Hix, D. (1998). Remote usability evaluation: Can users report their own critical incidents? Proceedings of CHI 1998, ACM Press, 253-254. (See also Virginia Tech Remote Usability Evaluation.)
  • Fu, L., Salvendy, G., & Turley, L. (1998). Who finds what in usability evaluation. Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, 1341-1345.
  • Petrie, H., Hamilton, F., King, N., & Pavan, P. (2006). Remote usability evaluation with disabled people. Proceedings of CHI 2006, ACM Press, 1133-1141.
  • Andreasen, M., Nielsen, H., Schrøder, S., & Stage, J. (2007). What happened to remote usability testing? An empirical study of three methods. Proceedings of CHI 2007, ACM Press.
  • Bruun, A., Gull, P., Hofmeister, L., & Stage, J. (2009). Let your users do the testing: A comparison of three remote asynchronous usability testing methods. Proceedings of CHI 2009, ACM Press, 1619-1628.

About Jeff Sauro

Jeff Sauro is the founding principal of MeasuringU, a company providing statistics and usability consulting to Fortune 1000 companies.
He is the author of over 20 journal articles and 5 books on statistics and the user experience.


Posted Comments


October 29, 2010 | Jeff Sauro wrote:


That's a great point. All metrics come with some bias or problem and it would be important to understand in more detail how such problems are different, especially if one is not able to corroborate this data with other sources.

There is more detail in a couple of the references, for example on the severity and type of issues but certainly it's a topic worth investigating more. 

October 28, 2010 | Jessica Kerr wrote:

As always, an interesting post and good ideas for ways to get the most information we can in different situations.

One aspect of the research concerns me though. Castillo et al.'s users apparently didn't feel the process of reporting significantly influenced their ability to do the task.

However, not _feeling_ influenced is, as you well know, very different from not actually _being_ influenced.

As user experience professionals we're always reinforcing how people's reports of their behaviour are often very different from their actual behaviour. So it seems there's still a big question here about whether being tasked with reporting issues changes the user's behaviour and if so, is it in any consistent direction or manner?


