Usability, Customer Experience & Statistics

6 things you didn't know about Heuristic Evaluations

Jeff Sauro • August 31, 2010

A heuristic evaluation is a process in which someone trained in usability principles reviews an application (a website or software). She compares the application against a set of guidelines or principles ("heuristics") that tend to make for more usable applications.

For example, if, while completing a task, a user gets a message that says "Error 1000xz Contact System Administrator," this violates the heuristic "Error messages should be expressed in plain language (no codes)."
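As an illustration of that heuristic in code (the error code and messages here are hypothetical, not from any real system), the idea is to translate internal codes into plain language before they ever reach the user:

```python
# Map internal error codes (hypothetical examples) to plain-language
# messages, so users never see "Error 1000xz Contact System Administrator".
PLAIN_MESSAGES = {
    "1000xz": "We couldn't save your changes. Please try again in a few minutes.",
}

def user_facing_error(code: str) -> str:
    """Return a plain-language message for an internal error code,
    falling back to a generic message rather than exposing the raw code."""
    return PLAIN_MESSAGES.get(
        code,
        "Something went wrong on our end. Please try again or contact support.",
    )

print(user_facing_error("1000xz"))
```

The key design choice is the fallback: even an unmapped code never leaks to the user.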

Maybe you've already heard of, and even use, heuristic evaluations. Here are six things you might NOT know about this popular usability method:

  1. An effective heuristic evaluation needs multiple evaluators (3 to 5), not just one, because different evaluators tend to find different problems.

  2. Heuristic evaluations should be done prior to and in addition to user-testing, not instead of it. If you have easy access to a few experts, you can identify a lot of low-hanging problems before subjecting users to them. What's more, you can then check your experts' effectiveness by seeing how many of the problems they identified are actually encountered by users during the user-test.

  3. Rolf Molich, one of the co-creators of the method, said "Heuristic Evaluations are 99% bad" at UPA 2009, on a panel with Jakob Nielsen and Chauncey Wilson about heuristic evaluations. His reaction was in large part because HE is often used instead of user-testing and is frequently only loosely based on any actual heuristics.

  4. Double experts are better than single experts: being an expert in both usability principles and the domain makes for better, more focused heuristic evaluations. So if you're testing a financial application, you'd get better results from a usability expert who also has an accounting background or is well versed in the lexicon and flow of accounting and finance software.

  5. There is not a single set of heuristics. Nielsen's 10 famous heuristics are the most widely used, but there are others:
    1. Bastien and Scapin's 18 Ergonomic Criteria
    2. Gerhardt-Powals' 10 Cognitive Engineering Principles
    3. Connell and Hammond's 30 Usability Principles
    4. Smith and Mosier's 944 guidelines for the design of user interfaces (from 1986)

  6. There is controversy over just how effective heuristic evaluations are at finding problems compared to user-testing. There is even controversy over how to determine the effectiveness of a usability method at all. This was one of the key criticisms in Gray and Salzman's influential 1998 paper, Damaged Merchandise: a simple count of problems found is neither a reliable nor a valid measure of effectiveness.
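The reasoning behind point 1 is often quantified with the problem-discovery formula popularized by Nielsen and Landauer: if each evaluator independently finds an average proportion λ of the problems, n evaluators are expected to find 1 − (1 − λ)^n of them. A minimal sketch, assuming λ = 0.35 (a commonly cited single-evaluator rate; treat it as illustrative):

```python
def proportion_found(lam: float, n: int) -> float:
    """Expected proportion of usability problems found by n independent
    evaluators, each finding a proportion lam of problems on average."""
    return 1 - (1 - lam) ** n

# With lam = 0.35, diminishing returns set in quickly:
for n in (1, 3, 5):
    print(f"{n} evaluator(s): {proportion_found(0.35, n):.0%}")
# 1 evaluator(s): 35%
# 3 evaluator(s): 73%
# 5 evaluator(s): 88%
```

This is why 3 to 5 evaluators is the usual recommendation: the expected yield roughly doubles going from one evaluator to three, while each evaluator after five adds comparatively little.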
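The check described in point 2 — seeing how many HE-identified problems users actually encounter — amounts to a set intersection. A small sketch with hypothetical problem IDs:

```python
# Hypothetical problem IDs; in practice these would come from your
# HE report and your user-test observation logs.
he_problems = {"P1", "P2", "P3", "P4", "P5"}   # flagged by evaluators
user_observed = {"P2", "P3", "P5", "P6"}       # encountered during user-testing

confirmed = he_problems & user_observed
hit_rate = len(confirmed) / len(he_problems)
print(f"{len(confirmed)} of {len(he_problems)} HE problems confirmed ({hit_rate:.0%})")
# prints: 3 of 5 HE problems confirmed (60%)
```

Note that problems users hit which the experts missed (P6 above) are just as informative as the hit rate itself.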

About Jeff Sauro

Jeff Sauro is the founding principal of MeasuringU, a company providing statistics and usability consulting to Fortune 1000 companies.
He is the author of over 20 journal articles and 5 books on statistics and the user experience.


Posted Comments

March 30, 2011 | Keith wrote:

Multiple evaluators! This has me re-thinking a lot. Currently we have one usability specialist do an evaluation prior to user testing and make conversion optimization recommendations; however, using several would be better. Thank you!

September 1, 2010 | Jeff Sauro wrote:

Ash, Thanks for the reference to ISO 9241-10:1996. 

August 31, 2010 | Ash Donaldson wrote:

A rather influential set of heuristics missing from your list is the International Standard, ISO 9241-110:2006 Ergonomics of human-system interaction: Dialogue Principles (an update of ISO 9241-10:1996) 
