
5 Steps to Conducting an Effective Expert Review

Jeff Sauro • February 9, 2016

Expert reviews aren't a substitute for usability testing and don't provide metrics for benchmarking.

But they are an effective and relatively inexpensive way to uncover the more obvious pain points in the user experience.

Expert reviews are best used when you can't conduct a usability test or in conjunction with insights collected from observing even just a handful of users attempting realistic tasks on a website or application.
The following five steps for conducting an effective expert review won't make you an "expert" in interface evaluation overnight, but with enough practice, applying them eventually might!

1. Understand the Method & Human Behavior

An expert review is not just your opinion of likes and dislikes. While you do need to use your judgment, that judgment should be guided by principles of how humans interact with computers—which will ideally be backed by research.

Here are a few suggestions to start reading up on expert reviews:
  • Distinguish between Heuristic Evaluations and expert reviews. A good place to start your research is with one of the seminal papers on Heuristic Evaluations (HE). People often call an expert review a Heuristic Evaluation, but a Heuristic Evaluation is a special type of expert review that's guided by general principles.

  • Learn the strengths and weaknesses of expert reviews. This good paper [PDF] provides background and context for much of the literature and the varieties of expert reviews (also called inspection methods).

  • Acquaint yourself with Molich & Nielsen's 10 Heuristics. Even if you're not planning on conducting a strict Heuristic Evaluation, you should be familiar with the most commonly cited heuristics.

Now for the hard part. The most effective evaluators are those who know a lot about interface design principles, common usability pitfalls, AND the domain of the product, app, or website.

It's sort of like building your vocabulary. The best way is to read more books AND supplement that with some memorizing. The same applies to expert reviews: the more users you observe trying to use websites and software, the better you'll be at anticipating problems. You can supplement this experience (which takes longer to build) by reading as much as you can on design principles.

2. Have Some Idea of Common Tasks Users Will Perform

We've found that focusing an expert review on the top tasks a user might likely perform will help improve the number and relevance of problems uncovered. It may also reduce the chances the problems you find are false positives.

A common way to apply the expert review is to use a modified cognitive walkthrough. Decompose each task as the user would attempt it, and try to think like the users. This means you should have some data on both the types of tasks and the domain knowledge (e.g., terminology) users are likely to have when they're using the interface.
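To make the walkthrough concrete, here is a minimal sketch of stepping through a decomposed task with the standard cognitive-walkthrough questions. The task name and steps are hypothetical examples, not from the article:

```python
# Standard cognitive-walkthrough questions asked at each step of a task.
WALKTHROUGH_QUESTIONS = [
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the effect they want?",
    "After the action, will the user see that progress is being made?",
]

# A hypothetical top task, decomposed into the steps a user would attempt.
task = {
    "name": "Renew a prescription",
    "steps": [
        "Find the pharmacy section",
        "Locate the renewal form",
        "Enter the prescription number",
        "Submit the form",
    ],
}

# Walk each step, asking every question; a "no" answer flags a problem.
for step in task["steps"]:
    print(f"Step: {step}")
    for question in WALKTHROUGH_QUESTIONS:
        print(f"  - {question}")
```

Knowing the users' likely terminology matters here: a step like "Find the pharmacy section" fails the second question if the site labels that section with internal jargon instead.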

3. Conduct the Review Methodically and Independently

Conduct the review. You can go low tech or high tech. Use a Word document, spreadsheet, PowerPoint deck, paper, or web form to record your observations. Think globally and locally about the experience: look for issues that span multiple screens (like navigation) and issues that are more idiosyncratic (like content or actions on a specific page). Record screenshots with clearly articulated examples of what you've identified as a problem, its possible impact, and suggestions for improvement (when appropriate).

For example, Yahoo! Mail recently moved its delete button and put an archive button in its place. This is likely to result in users accidentally archiving email instead of deleting it. Judging from some online comments, it's a problem many people are having.

As another example, a form used to submit information on a government website has the Back button located on the right and the Submit button on the left. Both have equal prominence; it's likely many users will click the wrong button (the Back button instead of the Submit button and vice versa) and have to re-enter their data, or will incorrectly submit their data. The prominence and placement of the Back button needs to better conform to conventions and expected locations.
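Whatever format you choose for recording observations, each entry should capture the same fields. As a hypothetical sketch (the field names and example entry are illustrative, not a prescribed format), a simple structure for an issue log might look like:

```python
from dataclasses import dataclass

@dataclass
class Issue:
    """One observation from an expert review (hypothetical field names)."""
    screen: str            # where the problem appears
    description: str       # clearly articulated example of the problem
    impact: str            # possible impact on the user
    suggestion: str = ""   # improvement suggestion, when appropriate
    scope: str = "local"   # "global" (spans screens) or "local" (one page)
    screenshot: str = ""   # path to an annotated screenshot

# Example entry based on the Yahoo! Mail issue described above.
issues = [
    Issue(
        screen="Inbox toolbar",
        description="Archive button now sits where the Delete button used to be",
        impact="Users accidentally archive email instead of deleting it",
        suggestion="Restore the conventional button placement",
        scope="local",
    ),
]

# e.g., pull out only the global (cross-screen) issues for the report
global_issues = [i for i in issues if i.scope == "global"]
```

Tagging each issue as global or local makes it easy to separate navigation-wide problems from page-specific ones when you write up the findings.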


4. Have Another Expert Perform an Independent Review

The best expert reviews are those that involve multiple evaluators working independently. Have at least one other person conduct the review; ideally, three to five evaluators is a good number. Even less experienced evaluators can provide a fresh perspective alongside more seasoned experts. This redundancy helps find more issues AND minimizes the perception that the exercise is just one person's biased opinion.

5. Categorize, Reconcile Differences, and Add Severity

Aggregate your results and report on which issues multiple evaluators identified. This will help corroborate the findings (a measure of validity). Reconcile the problems that only some (or one) of the evaluators found and determine whether these are unique problems or just different instances of a problem already identified. Expect to find a lot of unique problems, but remember that just because only one evaluator identified a problem doesn't mean it isn't legitimate.

Consider using a simple severity rating (e.g., minor, moderate, severe), or find another way to prioritize the issue list if the review uncovers a lot of problems.
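As an illustrative sketch of the aggregation step (the issue IDs, evaluator names, and numeric severity scale are assumptions, not part of the article), you can tally how many evaluators found each issue and sort by agreement and severity:

```python
from collections import defaultdict

# Hypothetical findings: evaluator -> list of (issue_id, severity),
# where severity is 1 = minor, 2 = moderate, 3 = severe.
findings = {
    "evaluator_a": [("back-button-placement", 3), ("jargon-labels", 2)],
    "evaluator_b": [("back-button-placement", 2), ("tiny-font", 1)],
    "evaluator_c": [("jargon-labels", 2)],
}

# Collect the severity ratings assigned to each issue.
agreement = defaultdict(list)
for evaluator, issue_list in findings.items():
    for issue_id, severity in issue_list:
        agreement[issue_id].append(severity)

# Prioritize: issues found by more evaluators first, then by mean severity.
report = sorted(
    agreement.items(),
    key=lambda kv: (len(kv[1]), sum(kv[1]) / len(kv[1])),
    reverse=True,
)

for issue_id, severities in report:
    print(f"{issue_id}: found by {len(severities)} evaluator(s), "
          f"mean severity {sum(severities) / len(severities):.1f}")
```

Issues found by multiple evaluators rise to the top of the report (corroboration), while singletons stay on the list for reconciliation rather than being discarded.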

Oh, and it's okay to point out things that are working well! While it's easy to get focused on what to fix—after all, that's usually the reason for the review—point out the positives when you see them!

About Jeff Sauro

Jeff Sauro is the founding principal of MeasuringU, a company providing statistics and usability consulting to Fortune 1000 companies.
He is the author of over 20 journal articles and 5 books on statistics and the user experience.

Learn More

UX Bootcamps: Rome: June 20-22, 2016 and Denver: Aug 17-21, 2016
Quantifying the User Experience Half Day Seminar: London: June 15th, 2016 and Chicago: July 15th, 2016
