
Seven Tips for Writing Usability Task Scenarios

Jeff Sauro • April 16, 2013

The core idea behind usability testing is having real people try to accomplish real tasks on software, websites, cell phones, or hardware.

Identifying what users are trying to do is a key first step. Once you know what tasks you want to test, you'll want to create realistic task scenarios for participants to attempt.

A task is made up of the steps a user has to perform to accomplish a goal. A task scenario describes what the test user is trying to achieve, providing some context and the details necessary to accomplish the goal.

Crafting task scenarios is a balance: provide enough information that users aren't guessing what they're supposed to do, but not so much that you lose the discovery and nonlinearity of real-world application usage.
  1. Be specific: Give participants a reason or purpose for performing the task. Instead of a generality like "find a new kitchen appliance," ask them to find a blender for under $75 that has high customer ratings.

    While users might start searching with only a general idea of what they want, they will quickly narrow their selection based on the usual suspects: price, indicators of quality, and recommendations. In the artificial world of usability testing, users given a vague task will often run into problems and look to the moderator (if there is one) to tell them what to find. Don't be so vague in your task that users have to guess what you want them to do. For example: "You need to rent a mid-sized car from Boston's Logan Airport, picking it up on July 21st at 10am and returning it on July 23rd at noon."

  2. Don't tell the user where to click and what to do: While providing specific details is important, don't walk users through every step. Leading users too much biases the results and makes them less useful. For example, instead of saying "Click on the small check box at the bottom of the screen to add GPS," just say "Add GPS to your rental car."

  3. Use the user's language and not the company's language: It's a common mistake to mirror the internal structure of a company on a website's navigation. It's also bad practice to ask participants to do things based on internal company jargon or terms. If users don't use the terms used in a scenario, it can lead to false positive test results or outright confusion.  Do users really use the term "asset" when referring to their kids' college funds?  Will a user know what a product "configurator" is or an "item-page" or even the "mega menu?"

  4. Have a correct solution: If you ask a user to find the rental car location nearest to a hotel address, there should be a correct choice. This makes the task more straightforward for the user and makes it easier for you to tell whether the task was completed successfully. The problem with a "find a product that's right for you" task is that participants aren't really in a problem-solving state of mind: at the time of the test there probably isn't a product that's right for them, and they're more interested in finishing the test and collecting their honorarium. Any product selection can then feel correct, which inflates basic metrics like task-completion rates.
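Once each attempt can be scored pass/fail against that correct solution, the completion rate is a simple proportion. As an illustrative sketch (not from the article): the adjusted-Wald interval is one common way to put a confidence interval around completion rates from the small samples typical of usability tests.

```python
import math

def completion_rate_ci(successes, n, z=1.96):
    """Adjusted-Wald (Agresti-Coull) confidence interval for a
    task-completion rate; behaves well with small usability samples."""
    # Add z^2/2 successes and z^2 trials before computing the proportion.
    n_adj = n + z**2
    p_adj = (successes + z**2 / 2) / n_adj
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# Hypothetical example: 9 of 12 participants completed the rental-car task.
low, high = completion_rate_ci(9, 12)
print(f"completion rate: {9/12:.0%}, 95% CI: {low:.0%} to {high:.0%}")
```

With 12 participants the interval is wide, which is a useful reminder of how much uncertainty sits behind a single observed completion rate.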

  5. Don't make tasks dependent (if possible): It's important to alternate the presentation order of tasks, because there is a significant learning effect. If your tasks have dependencies (e.g., create a file in one task, then delete that same file in another), a user who fails the first task will often necessarily fail the second. Do your best to avoid dependencies (e.g., have the user delete a different file). This isn't always possible if you're testing an installation process, but be cognizant of both the bias and the complications that dependencies introduce.
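One way to alternate presentation order while keeping an unavoidable dependency intact is to shuffle the independent tasks per participant and insert the dependent pair as a single unit. A hypothetical sketch (task names and the helper are invented for illustration):

```python
import random

def order_for_participant(independent_tasks, dependent_chain, seed):
    """Shuffle independent tasks per participant, but keep a chain of
    dependent tasks (e.g., create a file, then delete it) in order."""
    rng = random.Random(seed)  # seed per participant for reproducibility
    tasks = list(independent_tasks)
    rng.shuffle(tasks)
    # Insert the dependent chain as one contiguous unit at a random slot.
    slot = rng.randint(0, len(tasks))
    return tasks[:slot] + list(dependent_chain) + tasks[slot:]

independent = ["rent a car", "add GPS", "find nearest location"]
chain = ["create a reservation", "cancel the reservation"]
print(order_for_participant(independent, chain, seed=7))
```

Seeding per participant keeps each person's order reproducible for later analysis, while still varying order across the sample.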

  6. Provide context but keep the scenario short: You want to provide some context to get the user thinking as if they were actually needing to perform the task. But don't go overboard with the details. For example, "You will be attending a conference in Boston in July and need to rent a car."

  7. Task scenarios differ for moderated and unmoderated testing: The art of task-scenario writing has been honed over the years largely through moderated, lab-based testing. Unmoderated usability testing requires an additional level of refinement: you can't rely on a moderator to encourage users through a task or to ask them what they'd expect.

    While you don't want to lead users with step-by-step instructions, you do need to be more explicit: provide product names, specific price ranges, and brands. While some people might be concerned that this will lead the user, I rarely see a task-completion rate above 90% in unmoderated benchmark studies. Even with all these details spelled out, users get lost in the navigation or the checkout procedures and are confused by simple things like terminology and overall organization that aren't obvious to developers so close to a design.

It takes some practice to balance not leading users on the one hand and not making the task too difficult on the other. There are no universally "right" tasks, so don't be afraid to tweak details for different methods (moderated vs. unmoderated) or different goals (findability vs. checkout). It's even fine to read task scenarios out loud instead of having them printed or on the screen (we do this a lot with mobile testing).

For more information on writing better usability task scenarios, one of the best sources is a classic: A Practical Guide to Usability Testing (1993) by Dumas and Redish. See also A Practical Guide to Measuring Usability and, for unmoderated studies, Beyond the Usability Lab.


About Jeff Sauro

Jeff Sauro is the founding principal of MeasuringU, a company providing statistics and usability consulting to Fortune 1000 companies.
He is the author of over 20 journal articles and 5 books on statistics and the user-experience.





