Twist your tools: Test an IVR phone system on the web

Posted on September 16, 2013 by Lisa

One of the joys of UX research is the chance to come up with a creative solution to a research challenge. In this case, I had a short time to validate the menu design for a huge interactive voice response (IVR) telephone system. Many people still call in to this government agency’s phone system, usually after first seeking the answer on its website. Instead of recording the script, I wanted to have a collaborator act as the system. The challenge was the 40-page script – too long for anyone to leaf through quickly enough to be convincing. The solution was to use Optimal Workshop’s [2] Treejack [3] to drive and display the script. It worked so well that we were able to add an alternative version of the script to the study within the original time frame.

The Challenge: Design and test A & B versions of a complex voice interface

The government team was new to working with a UX researcher early in the process. They had designed a traditional ‘press 1 – press 2’ navigation menu for the voice system and expected to record it first, then have the research study validate the design. I persuaded them to run the study before recording to get the most value. Early research also opens the team’s mind to alternatives – in particular, a design that more closely matched the website menus, since most callers had already been on the site.

Thinking about the web menus opened up Treejack as the solution to the unwieldy script. I had often used Optimal Workshop’s Treejack to test web menus and navigation (in fact, I wrote about it here [1]). We decided to feed the long script into Treejack, then have a live collaborator read out the menus and click in the participants’ answers. Once the script was in Treejack and we could visualize and try out use cases, we could quickly and easily create an alternative design that matched the site menus.

Step 1 – Process the script into Excel and import into Treejack

The next challenge was to get the script into Treejack. It didn’t read like a tree, but it did have a hierarchical numbering scheme. Our developer came up with some Ruby code that read in the MS Word script document and used the numbering scheme to write it out as an Excel spreadsheet formatted for Treejack (a sketch of that conversion appears after the figure below). Once the script was in Treejack, we added the top tasks the participants would complete, and we then had a fully interactive version of the scripted menus.

Process of Word to Excel to Treejack [4]
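
Here’s a minimal Ruby sketch of that kind of conversion – not our production script. It assumes the Word document has first been saved as plain text (script.txt, an illustrative name), that each menu line starts with its hierarchical number (e.g. “2.1.3 Billing enquiries”, an invented example), and that the tree is laid out with one node per row and each level of the hierarchy in its own column, which is roughly how Treejack’s spreadsheet import is arranged. It writes a CSV, which opens in Excel:

    require 'csv'

    # Turn a numbered script line like "2.1.3 Billing enquiries" into a
    # column-indented row: each level of the hierarchy shifts the label
    # one column to the right, one menu node per row.
    NUMBERED_LINE = /\A(\d+(?:\.\d+)*)[.)]?\s+(.+)\z/

    rows = []
    # Assumes the Word script was first saved as plain text (script.txt).
    File.foreach('script.txt') do |line|
      match = NUMBERED_LINE.match(line.strip) or next  # skip prose between menus
      numbering, label = match[1], match[2]
      depth = numbering.split('.').length              # "2.1.3" -> depth 3
      rows << Array.new(depth - 1, '') + [label]       # indent by empty columns
    end

    # Write a CSV that opens in Excel and pastes into the tree-testing tool.
    CSV.open('treejack_tree.csv', 'w') { |csv| rows.each { |row| csv << row } }
    puts "Wrote #{rows.length} menu nodes to treejack_tree.csv"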

Step 2 – Pilot test with 3 people – facilitator, collaborator & participant

We ran a pilot session with a local participant and were thrilled at how well the process worked. Each session required two Neo Insight staff: one as the facilitator, and one as the collaborator playing the live phone system. We connected over GoToMeeting, with the Treejack study running on our shared screens, and the participant connected to the meeting over the telephone. To complete the tasks, the participant on the phone simply spoke their menu selection (“press 2”) rather than pressing a dialpad button. The collaborator clicked each selection into Treejack, and then read out the next menu or message it displayed. Later, we were able to use both the session videos and Treejack’s path and summary data to analyze participant performance and behaviour across the different scripts.

3 people and their roles [5]

Step 3 – Create an alternative voice menu that matched the website

While participants were being recruited and scheduled, we copied the script’s Excel file and quickly changed the top level of the tree to match the website menus. The idea was to take advantage of callers’ experience with the site menus to ease their navigation through the audio menu. We imported this ‘B’ version into a duplicate Treejack study, and created a facilitator guide that had half the participants do half the tasks on one tree, and the other half on the other (one possible layout for that counterbalancing is sketched below).
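
The facilitator guide itself isn’t reproduced here, but a small Ruby sketch shows one plausible way to lay out that kind of counterbalanced schedule. The 12 participants and 6 tasks match our study; the alternation rule and the task split are illustrative assumptions, not our actual guide:

    # One plausible counterbalanced schedule (illustrative assumptions:
    # odd-numbered participants start on tree A, and each participant does
    # the first half of the tasks on their starting tree, the rest on the other).
    PARTICIPANTS = 12
    TASKS = (1..6).to_a
    first_half, second_half = TASKS.each_slice(TASKS.length / 2).to_a

    (1..PARTICIPANTS).each do |p|
      start_tree, other_tree = p.odd? ? %w[A B] : %w[B A]
      puts "P#{p}: tasks #{first_half.join(', ')} on tree #{start_tree}; " \
           "tasks #{second_half.join(', ')} on tree #{other_tree}"
    end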

Results and insights

We ran 12 participants across 6 tasks. Our data showed that people were both faster and more accurate with the design that matched the website menus, which the government team later recorded and implemented. The study also caught some issues that helped persuade the design team to move the most frequently chosen options to the top of the menus and to better differentiate some of the menu options.

The results also yielded insights that we couldn’t have learned from testing the recordings. Our collaborator was human, and couldn’t help acting that way!

For example:

  • When participants selected the ‘Play the menu again’ option, the collaborator responded naturally and spoke more slowly the second time through. Since it’s possible to vary the playback speed of recorded announcements, we recommended a slower speed for replaying the menu.
  • Several participants chose to hear again a particular menu that had very similar options in it. The collaborator emphasized the words that highlighted the difference between the options as he spoke the menu the second time. We recommended adding that emphasis into the script for the recordings.

This voice response study was fun and successful. It introduced the team to the benefits of early UX research and of creating alternative designs, and the evidence it produced led to a more usable telephone voice response system closely tied to the agency’s website.


URLs in this post:

[1] here: https://blog.optimalworkshop.com/can-remote-tree-testing-predict-moderated-results-a-fascinating-study-by-a-ux-expert

[2] Optimal Workshop: https://www.optimalworkshop.com/

[3] Treejack: https://www.optimalworkshop.com/treejack

[4] Image: http://neoinsight.com/components/com_wordpress/wp/wp-content/uploads/2013/09/script-process.png

[5] Image: http://neoinsight.com/components/com_wordpress/wp/wp-content/uploads/2013/09/voice-study-design.png
