Task performance measurement using Loop11

In our November newsletter we challenged you to try two tasks on each of two online bookstore websites – Amazon.com versus BarnesandNoble.com. We had 45 people respond to the challenge.

The main results are shown below from a screen capture of the Loop11 dashboard.

[Screenshot: Loop11 dashboard showing overview of test results]

  • Wording of Find video task
    • Using the <name of bookstore> website, find the detailed description page for the Blu-ray disc version of the movie “You’ve Got Mail”.
  • Wording of Find magazine task
    • Using the <name of bookstore> website, find out how long you would have to wait to start receiving a subscription to Smithsonian magazine.


Success rates were 85-88% for all tasks except the Find video task on the Barnes and Noble website. In that task, 23% of participants thought they had found the correct answer, but the page they found was not the Blu-ray version.

The purpose of this summary is not to cover the results in detail but to show an example of benchmarking competitive sites. This is the big advantage of Loop11: it lets you run comparison benchmarks against websites over which you have no control. The program does not require you to add any code to the site being tested, and test participants do not have to download or install anything.

Unfortunately, it doesn’t work well with all websites. We have encountered long page-load delays on some sites, notably many of the Canadian government websites we’ve tested. Code is now available to add to these sites to overcome the delay, but you do need administrator access to the website.

Loop11 does provide a free preview mode, so you can check whether the websites you want to test will work well before setting up an account. A project costs a flat fee of US$350 and can include up to 1,000 participants with an unlimited number of tasks and questions.

Loop11 is also good for unmoderated A/B testing, where you want to compare two design options or before/after scenarios. As with any unmoderated testing, getting the task wording crisp, unambiguous, and unbiased is absolutely critical to success.

Loop11 generates a comprehensive results report in PDF format. You can also download an Excel-formatted spreadsheet for further analysis. The summary data provided in the report are:

Overall data

  • Completion rates (success, fail, abandon)
  • Page views required
  • Time to complete
  • Most common success pages
  • Most common first clicks
  • Most common click stream (actual sequence of links and pages)
  • Most common abandon page
  • Answers to open-ended questions

Detailed data by participant

  • Total time
  • Average time
  • Average page views
  • IP address
  • Browser
  • Date and time
  • Detailed path analysis
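The downloaded spreadsheet lends itself to simple scripted analysis. As a minimal sketch, here is how you might compute per-task completion rates and average times from an export saved as CSV, using only Python's standard library. The column names (`participant`, `task`, `outcome`, `time_secs`) and the sample rows are hypothetical; Loop11's actual export schema may differ.

```python
import csv
import io
from collections import Counter, defaultdict

# Hypothetical export; Loop11's real column names may differ.
SAMPLE = """participant,task,outcome,time_secs
p1,Find video,success,95
p2,Find video,fail,140
p3,Find video,abandon,60
p1,Find magazine,success,80
p2,Find magazine,success,70
p3,Find magazine,success,110
"""

def summarize(csv_text):
    """Return per-task outcome counts, success rate, and average time."""
    outcomes = defaultdict(Counter)
    times = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        outcomes[row["task"]][row["outcome"]] += 1
        times[row["task"]].append(float(row["time_secs"]))
    return {
        task: {
            "counts": dict(counts),
            "success_rate": counts["success"] / sum(counts.values()),
            "avg_time_secs": sum(times[task]) / len(times[task]),
        }
        for task, counts in outcomes.items()
    }

if __name__ == "__main__":
    for task, stats in summarize(SAMPLE).items():
        print(task, stats)
```

For a real export you would read the file with `csv.DictReader(open(path))` instead of the embedded sample, after mapping the actual column headers.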


© 2012-7 Neo Insight. All Rights Reserved