
I firmly believe that observing real users doing real tasks is the 'gold standard' for usability testing, particularly when the designers observe the sessions themselves and see the problems that only real users can find.

However, sometimes full user testing falls outside the budget and the project manager will decide to use an expert usability assessment instead. 

This works well for websites, where an expert usability consultant can put themselves in the shoes of the user and work through typical tasks, identifying critical usability issues.

But what if the system supports far more complex tasks, which users take years to learn?

Usability checklist

One solution is to provide the expert users with a checklist based on International Standard ISO 9241-110 to help them judge how well their system meets best practice usability principles.

The seven principles which need to be checked are:

  1. Fit with user’s task. The interface should match the way users perform their tasks.
  2. Signposting. The users should be able to tell from the screens where they are in the task and what they can do next.
  3. Intuitiveness. The software should respond to the task context and follow accepted conventions for the target users.
  4. Learnability. The software should support learning.
  5. User control. The user should be able to control the interaction.
  6. Error tolerance. The software should minimise the likelihood and severity of errors.
  7. Customisation. Users should be able to customise the software to suit their skills and preferences.

For each principle, the checklist needs several specific component questions. For example, under principle two, Signposting, the user is asked 'Is it clear what input is required?', to which the user answers 'always', 'most of the time', 'sometimes' or 'never'.

When users identify critical issues (for example, a screen where it is not at all clear what input is required), it is a good idea to take a screenshot to capture and illustrate the problem.

At the end of each section, the user should be asked to give an overall judgement of the system (as a percentage) for that principle.
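
To make the structure concrete, here is a minimal sketch in Python of how a checklist section might be recorded. It is purely illustrative: the extra Signposting questions and the screenshot name are hypothetical, and only the four-point response scale, the per-question format and the overall percentage judgement come from the description above.

```python
from dataclasses import dataclass, field
from typing import Optional

# The four-point response scale used for every component question.
RESPONSES = ("always", "most of the time", "sometimes", "never")

@dataclass
class ComponentQuestion:
    text: str
    response: Optional[str] = None   # one of RESPONSES once answered
    evidence: Optional[str] = None   # e.g. a screenshot illustrating a critical issue

@dataclass
class PrincipleSection:
    name: str
    questions: list = field(default_factory=list)
    overall_pct: Optional[int] = None  # the user's overall judgement for this principle

# Hypothetical record for principle 2, Signposting.
signposting = PrincipleSection(
    name="Signposting",
    questions=[
        ComponentQuestion("Is it clear what input is required?"),
        ComponentQuestion("Is it clear where you are in the task?"),  # hypothetical wording
        ComponentQuestion("Is it clear what you can do next?"),       # hypothetical wording
    ],
)

# An expert user fills in their judgements.
signposting.questions[0].response = "sometimes"
signposting.questions[0].evidence = "order-entry.png"  # hypothetical screenshot name
signposting.overall_pct = 60
```

Capturing the answers in a structure like this makes it straightforward to aggregate scores across users afterwards.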

The results

The results are usually presented graphically and are particularly useful for benchmarking and for exploring why different users have different experiences.

For example, the diagram below shows three users’ judgements for a decision support system.

[Figure: spider diagram of three users' usability judgements across the seven principles]
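
If you want to produce this kind of spider (radar) diagram yourself, a short matplotlib sketch along the following lines will do it. The percentage scores here are invented for illustration and are not the values from the diagram above.

```python
import numpy as np
import matplotlib.pyplot as plt

principles = ["Fit with task", "Signposting", "Intuitiveness", "Learnability",
              "User control", "Error tolerance", "Customisation"]

# Hypothetical per-principle percentages for three expert users.
scores = {
    "User A": [80, 60, 70, 65, 75, 50, 40],
    "User B": [70, 55, 65, 70, 60, 45, 55],
    "User C": [85, 70, 75, 60, 80, 65, 35],
}

# One spoke per principle, then repeat the first angle to close the polygon.
angles = np.linspace(0, 2 * np.pi, len(principles), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
for user, values in scores.items():
    values = values + values[:1]          # close the polygon
    ax.plot(angles, values, label=user)
    ax.fill(angles, values, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(principles)
ax.set_ylim(0, 100)
ax.legend(loc="upper right", bbox_to_anchor=(1.3, 1.1))
plt.show()
```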

The end result is a structured record of expert users’ judgements on key usability issues and lots of specific examples of bugs for developers to fix. 

It is not quite as powerful as regular usability testing but it is practical, quick, efficient and revealing.


Published 26 September, 2011 by Tom Stewart

Tom Stewart is Executive Chairman at System Concepts, and a guest blogger at Econsultancy.


Comments (2)


Andrew Lloyd Gordon, Digital Marketing Expert, Speaker and Trainer at New Terrain Limited

Hi Tom

An interesting piece thank you. I've not come across this approach before.

One question to clarify please...

With Step 1 of the 7 Step Process, if you've been unable to watch real users performing their task, how can the Expert Usability Consultant be certain that, "The interface should match the way users perform their tasks."?

I think I'm simply being dumb and missing the point :0

about 5 years ago


Tom Stewart, Founder at System Concepts

Hi Andrew
Sorry, it's my fault for not being absolutely clear - the judgements are made by the expert users (ie domain experts) not the usability specialist whose role is to help the users understand the issues and make their judgements. So with regard to Step 1, I have found users quite happy to say how well the system matches the way they work (and give insightful examples when it doesn't).
Hope that makes sense!

about 5 years ago

