What is Usability Testing?

Defining the word most agencies get wrong.

Written by Stephen Courtney, CRO and UX Strategist | 17/01/22

Conversion Services | 9 MIN READ

Usability testing is a part of the user testing process, showing how easy it is for visitors to complete specific tasks. Designers and developers build websites with particular actions in mind, and usability testing shows whether a design facilitates those behaviours. Knowing when to do usability testing, understanding how it works and learning to read the results are essential skills for a design and development team.

Usability testing is a task-oriented form of user testing that focuses on the ease of performing critical functions. It can be incorporated into the development process for a new website or used to evaluate an alternative UI design alongside, or instead of, A/B testing.

Stephen Courtney, CRO and UX Strategist

Usability testing shows how easily first-time visitors can perform essential tasks on a website or app. Unlike other forms of user testing, usability tests focus on pre-defined goals and actions, rather than experience or affect.

During a usability test, your participants work through a series of tasks that test your website’s most essential functions. Whilst the premise is simple, it is easy to skew results by choosing the wrong tasks, sampling the wrong people, or falling victim to experimenter biases.

What is Usability Testing and how does it work?

Designing, building, and updating a website always raises two big questions:

  1. What do my users need to do?
  2. What do I want them to do?

Any tasks that answer both questions will account for most of the visits you receive once your site goes live. When optimising your conversion funnels, they will also become the micro-conversions that a CRO agency targets. For example:

  • Searching for and finding a product
  • Registering for an account
  • Signing up for a mailing list
  • Completing a contact form
  • Making a purchase

During a usability test, trial users have a series of tasks to complete on your website. The insights you gather from their responses help you to perfect your design.

Usability testing in six steps

Step 1. Sampling: Define your target user

Defining your target user is essential because your sample needs to represent your actual user base. Excluding key demographics means you risk launching a flawed design, whilst throwing the net too wide wastes time and money. How you recruit your participants will also shape how representative your sample is.

When targeting participants, you should think about:

  • Age: Are your target users younger or older?
  • Location: Will your users come from a particular town or place?
  • Social background: Income and social background may influence how users interpret your content.
  • Interests: Users with a particular interest may be more familiar with specialist language.
  • Accessibility: Your sample should include users with accessibility issues such as impaired sight or colour-blindness. Estimates suggest around 1 in 12 men and 1 in 200 women have colour vision deficiency, which means some designs could exclude around 8% of your male visitors.
  • Screening: Consider if any factors disqualify a participant. For example, a home insurance website should only sample homeowners.

Once you have decided how to target your sample, you need to collect participants. Unfortunately, recruitment can be time-consuming and expensive, but there are three main ways to do it:

  • Third-party recruiters: This means hiring participants from agencies or SaaS platforms like UserTesting, TestingTime, and User Interviews. It’s an expensive option, but it allows you to segment your target users easily.
  • Custom sampling: Combining PPC adverts on Google or Facebook with an incentive (such as an Amazon voucher). The targeting of your ads can be reasonably precise, but you will still have to filter the responses carefully.
  • Informal sampling: For example, sending requests to a mailing list, customer database or personal network. This method is the cheapest option, but you can’t sample your participants as effectively.

Step 2. Task list: Isolate your key actions

It's easy to forget about simple features like navigation tools or catalogue filters. Unfortunately, even minor elements like these can end up forming a bottleneck. When UX experts analysed Marks and Spencer's website following its infamous 2014 launch, they found thousands of lost sales caused by simple UI errors. Usability testing can help prevent something as simple as a size filter from derailing your next launch.

Analytics data can help you uncover the micro-conversions that inch a user towards your primary goals. For example, there are two simple Google Analytics reports that show which actions your users take before a full conversion:

  • Top Events (filtered by converters): Depending on how you track events, Google Analytics' events reports can show you the actions your converters take most frequently.
  • Multi-Channel Report: Google Analytics provides a Multi-channel Funnels Report that shows you the journeys leading to a given conversion. If email plays a crucial role in your most common multi-channel journeys, you should include actions like completing a sign-up form in your task list.

Multichannel report in Google Analytics

How to Write Tasks for Usability Testing

Usability testing should interrogate specific parts of your website without "leading" your participants' behaviour. To do this, your tasks need to provide context and include an achievable aim. For example:

"You're a runner with size 10 feet, and you would like to buy a pair of trainers for under £50. You don't care about colour or brand, but the trainers must have good reviews. Starting from this landing page, find two suitable options and pick a favourite."

Here are some simple tips to help you write an effective task list:

  • Keep it natural – there's no point asking a participant to try something nobody would do in real life. Your website must trigger users' typical habits and schemas, and you can only test that if your tasks are realistic.
  • Use context rather than instructions – finding out if your users can intuit solutions is what usability testing is for. Backstory lets you guide participants rather than telling them how to do something.
  • Don't give clues – Your tasks should provide enough information for someone to complete them without prescribing the steps involved.
  • Move from easy to challenging – user confidence is fragile. If your tests begin with a challenging task, your subjects will be sceptical before they even hear the others.
  • Be clear and specific – when someone is struggling to complete a task, they often substitute one they can do. Removing ambiguity from your task list will help you clearly distinguish successful attempts.

Step 3. Staging: Set up your prototype for testing

Usability testing can be moderated (with an observer watching in real-time) or unmoderated (with the sessions reviewed via recordings). Both approaches have advantages, and you also need to decide whether to run your sessions remotely or in person.

How to moderate your usability testing sessions

  • In-person, moderated usability testing means you can meet your participants face-to-face and build a rapport. It lets you pick up on non-verbal cues when a task is especially difficult or frustrating, helping you to find more sophisticated insights. Unfortunately, moderated usability testing is the most expensive option, and it also comes with a greater risk of experimenter biases.
  • Remote, moderated usability testing, using an online testing platform, introduces a greater distance between participant and moderator. That means you can ask follow-up questions with less risk of biased answers. However, it is much more challenging to keep your participants engaged when separated by a screen.
  • Remote, unmoderated usability testing is cheaper and easier to scale than the other options. There is less risk of experimenter effects biasing your results, and you can gather material with little effort. However, the insights you gain are likely to be more superficial, and there is no opportunity to ask the follow-up questions that make them actionable.

Staging a usability test

How to stage a usability testing prototype

To run usability tests, your participants need to experience your website as they would in an ordinary browsing session. While testing a live website is relatively straightforward, pre-launch testing requires a high-fidelity prototype or a staging environment.

What is a staging environment?

A staging environment duplicates your website's code, is hosted on a separate URL, and is not indexed by search engines. Staging environments let you test complex updates before pushing them live.

Step 4. Pilot: Do a dry run

No matter how you decide to stage and moderate your testing, it always involves a significant investment. A pilot, conducted in the days before your scheduled test, lets you fix technical bugs in advance, saving you time and money.

As a moderator, running a pilot session also helps you develop routines and procedures. Better technique makes your sessions run smoothly, puts participants at ease, and prevents common biases.

💡 Usability testing tip

If possible, record your pilot session and review the footage. Observing yourself from a participant’s perspective will improve your moderating technique.

Step 5. Testing: Gather your data

The role of a moderator is to communicate the task list, create a natural environment and observe the session. It's important to put your participants at ease and ensure consistency throughout your sessions. The best way to do that is to prepare a script.

How to write a usability testing script

  • The preamble: Your introduction should reassure your participants and outline the session. The preamble is an excellent opportunity to ask questions that may explain anomalies in your results (for example, you might ask whether the participant is familiar with websites like the one you're testing).
  • Introducing tasks: You can write your tasks down, read them out or do both. However you communicate them, you should also take the opportunity to reduce any demand effects (for example, by explaining that there are no right or wrong answers).
  • Feedback: It's often helpful to prepare a list of neutral phrases that you can turn to when a participant gets stuck or completes a task. Not only does that help make each session run smoothly, but it also prevents the introduction of any bias through spontaneous responses.
  • Between tasks: Moving from one task to another provides an opportunity to ask follow-up questions. To ensure these questions don't prompt answers unfairly, write down some open-ended questions in advance.
  • Debrief: Once the task list has been completed, you can adopt a less guarded persona and interact with the participant naturally. Remember to thank them for their time and answer any questions they have.

How long should a Usability Testing session last?

Although there's no formal answer to the question of duration, participants quickly lose concentration once they start the task list, so keep sessions short. If you are moderating numerous sessions in a single day, think about your own stamina and plan breaks between sessions so you can prepare for each one properly.

Step 6. Analysis: Interpret your results

Alongside the qualitative insights your sessions provide, usability testing often incorporates a range of quantitative methods that let you measure the difficulty of your key actions.

The simplest way to quantify your participants' responses is to record every attempt at a task as a success or a failure. Then you can calculate the success rate for each task and compare it to an alternative prototype.

               Task 1 – Completed   Task 2 – Completed   Task 3 – Completed
  User 1       No                   Yes                  Yes
  User 2       No                   No                   Yes
  User 3       No                   Yes                  Yes
  User 4       No                   No                   Yes
  User 5       No                   Yes                  Yes
  Success rate 0%                   60%                  100%
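Once each attempt is logged as a pass or fail, the success rates are simple to compute programmatically. A minimal sketch in Python, using the illustrative figures from the table above (not real data):

```python
# Success/failure records per participant, one boolean per task,
# taken from the example table above.
results = {
    "User 1": [False, True, True],
    "User 2": [False, False, True],
    "User 3": [False, True, True],
    "User 4": [False, False, True],
    "User 5": [False, True, True],
}

def success_rates(results):
    """Return the percentage of participants who completed each task."""
    attempts = list(results.values())
    n_tasks = len(attempts[0])
    return [
        100 * sum(user[t] for user in attempts) / len(attempts)
        for t in range(n_tasks)
    ]

print(success_rates(results))  # [0.0, 60.0, 100.0]
```

The same structure works for comparing two prototypes: run the function once per prototype's result set and compare the lists task by task.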

More complex metrics allow you to measure things like the learning curve. For example, you can record the average time required for experienced users to complete one of your tasks and compare it to the equivalent for first-time users. That comparison shows you the extent to which familiarity improves performance (the “Experience Gap”).

                      Task 1 (secs)   Task 2 (secs)   Task 1 (clicks)   Task 2 (clicks)
  Experienced user 1  83              49              8                 5
  Experienced user 2  45              31              4                 5
  Mean                64              40              6                 5
  Test user 1         98 (+34)        55 (+15)        4 (-2)            5
  Test user 2         83 (+19)        68 (+28)        6                 7 (+2)
  Test user 3         49 (-15)        45 (+5)         10 (+4)           5
  Test user 4         87 (+23)        34 (-6)         8 (+2)            5
  Test user 5         110 (+46)       49 (+9)         8 (+2)            7 (+2)
  +/- mean            +21.4 secs      +10.2 secs      +1.2 clicks       +0.8 clicks
  Experience gap      33.4%           25.5%           20%               16%
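The experience gap is just the average first-timer overhead expressed as a percentage of the experienced users' mean. A sketch using the Task 1 timings from the table above (variable names are mine; the figures are the illustrative ones, not real data):

```python
from statistics import mean

# Task 1 completion times in seconds, from the example table above.
experienced = [83, 45]
first_time = [98, 83, 49, 87, 110]

baseline = mean(experienced)             # mean time for experienced users (64)
extra = mean(first_time) - baseline      # average extra time first-timers need
experience_gap = 100 * extra / baseline  # overhead as a % of the baseline

print(round(extra, 1), round(experience_gap, 1))  # 21.4 33.4
```

Swapping in click counts instead of seconds gives the click-based gap the same way.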

If you have a large enough sample, you should exclude outlier results to prevent extreme cases from skewing your data. However, these quantitative measures should only be used in conjunction with broader insights since the sample size and experimental controls are rarely sufficient to produce genuinely representative data.
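One common way to exclude outliers is Tukey's interquartile-range rule; this sketch assumes that approach (it is not the only valid one), with a hypothetical 400-second value standing in for something like an interrupted session:

```python
from statistics import quantiles

def drop_outliers(times):
    """Remove values more than 1.5 IQRs outside the quartiles (Tukey's rule)."""
    q1, _, q3 = quantiles(times, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [t for t in times if lo <= t <= hi]

# Example: one participant's session was interrupted mid-task.
times = [49, 83, 87, 98, 110, 400]
print(drop_outliers(times))  # [49, 83, 87, 98, 110]
```

With samples this small the rule is only a sanity check, which is why the quantitative measures should stay paired with your qualitative observations.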

When To Choose Usability Testing?

Usability testing allows you to evaluate prototypes according to clear and limited criteria. The results can build confidence during the early stages of production or identify aspects of a live site that require optimisation. For this reason, usability testing is a key resource for eCommerce and retail websites, where a large proportion of sessions follow a common pattern.

B2B service websites, and those focusing on branding, may find the insights from usability testing too limited. In that case, traditional user testing would provide richer insights about affect and experience.

Conclusion: What is Usability Testing?

Usability testing is a task-oriented form of user testing that focuses on the ease of performing critical functions. It can be incorporated into the development process for a new website or used to evaluate an alternative UI design alongside, or instead of, A/B testing.

It is possible for an in-house marketing department to run usability testing sessions with the right tools. However, most online businesses prefer to use a dedicated user testing agency with experienced moderators and analysts. For more information on incorporating user testing within your organisation, explore our guide to choosing the right user testing method.