Top Tasks Management @ Cisco

Gerry McGovern
9 min read · Jan 3, 2018


(Chapter 16 from Transform: A Rebel’s Guide for Digital Transformation)

From uploading software to helping customers download

The top tasks of Cisco customers are:

1. Downloading software

2. Configuration / set-up

3. Troubleshooting

Out of a final list of 67 tasks that customers voted on, these top three tasks got as much of the vote as the bottom 43. Since 2010, we have been running Task Performance Indicators, roughly every six months, to measure the performance of these tasks.
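This "long neck" pattern, where a handful of tasks attract as many votes as the entire long tail, is easy to sketch. The vote counts below are hypothetical, purely for illustration; the real Cisco survey data is not reproduced here:

```python
# Hypothetical vote counts for a 67-task list (illustration only).
# The shape mirrors the pattern described above: a few dominant tasks,
# then a long tail of tasks that each attract very few votes.
votes = [900, 700, 500] + [150] * 21 + [50] * 43  # 67 tasks in total

votes_sorted = sorted(votes, reverse=True)
top3 = sum(votes_sorted[:3])        # the "long neck"
bottom43 = sum(votes_sorted[-43:])  # the long tail

print(f"Top 3 tasks: {top3} votes; bottom 43 tasks: {bottom43} votes")
```

With these made-up numbers, the top three tasks roughly match the bottom 43 combined, which is the distribution the Cisco vote produced.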

Cisco customer Top Tasks results

In 2010, when we started measuring software downloads, it could take a typical customer up to 15 steps and more than 300 seconds to download a piece of software. By 2012, for a significant percentage of software, that had been reduced to an average of 4 steps and 40 seconds.

The key to the improvements was a change in culture and focus. It used to be that those involved in producing Cisco software felt that their job was done when they had uploaded the latest version onto the website. Now, they are focused on ensuring that customers can quickly and easily download the software. It’s a classic shift from old-model, input-based, organization-centric thinking to new-model, outcome-based customer experience measurements. Success is measured not based on what Cisco does, but rather on what the customer achieves. We move from measuring what we produce to measuring what customers can do with it.

According to Bill Skeet, Senior Manager of Customer Experience for Cisco Digital Support, this has had a “dramatic” impact on how people think about their jobs. “We now track the score of each task and set goals for each task. We have assigned tasks and goals to product managers to make sure we have a person responsible for managing the quality of the experience.”

Once employees become focused on customer outcomes, their whole way of thinking and working changes. They get clear evidence on what is slowing down customers, what is complicating things, where they are getting lost, where they give up. For example, one of the software download tasks was:

Get to the download page for the latest release of software for Cisco’s 7920 Wireless IP Phone.

The following image shows Step 7 in the original process.

Download software webpage

The link that customers were supposed to click on was “IP Phone and Utilities”. A clear pattern emerged when people arrived at this page. They were confused and didn’t know which link to click on. The cursor kept moving back and forth, up and down, sometimes it would literally go around in circles. Sometimes it would stop dead still and then after several seconds, begin to tentatively move again. A typical comment from a customer was: “I’m assuming that one of these is the one I want.” It is often simple changes to links that have the most dramatic impact in improving task performance.

The Task Performance Indicator gives you the context the customer is in. Most other metrics tell you what is happening. TPI tells you why something is happening. You know what task they are trying to complete and that can really change your whole perspective. Jeffrey Davis, User Research Lead for Cisco, believes that there are two core questions for anyone working online:

1. What are your users trying to do? (NOT the same as what they are actually doing)

2. How well are they able to do what they are trying to do?

“It is vitally important to make the distinction between ‘what users do (i.e. click on)’ and ‘what they are trying to do,’” Jeffrey states. “In talking to stakeholders, they often list web metric data as synonymous with ‘what users are trying to do’ (if they have any data at all). That is a dangerous position since it assumes that all users are accomplishing what they set out to do in the first place. The beauty of Top Tasks is that it measures what users ACTUALLY are trying to accomplish and also whether they are able to be successful in those tasks. When you add the visceral experience of a stakeholder actually watching user after user struggle with key tasks and score it with numerical values, that’s a great research methodology that can really promote change within an organization — as it has done at Cisco.”

The worst way to design something is to have 5 smart people in a room drinking lattes. The next worst way is to have 15 customers in a room drinking lattes, telling you what they think they do and what they want. “Decisions in the past were driven primarily by what customers said and not what they did,” Bill Skeet explains. “Of course, that sometimes didn’t yield great results because what users say and what they do can be quite different.”

Fixing bugs and being in control

The Task Performance Indicator score is something you can control. If you don’t do anything about a particular task, the score will remain the same the next time you measure. However, if you make improvements, the success rate will increase. The following image shows a series of scores received by the bug fix task:

Ports 2 and 3 on your ASR 9001 router, running v4.3.0 software, intermittently stop functioning for no apparent reason. Find the Cisco recommended fix or workaround for this issue.

Performance of bug fix task: 2012–2014

For a variety of reasons, it was difficult to solve the underlying problems connected with finding the right bug fix information on the Cisco website. Thus, the scores from February 2012 to February 2013 did not improve in any significant way. For the May 2013 measurement, the team ran a pilot to show how, with the proper investment, it could be made a lot easier to find bug fix information. As we can see in the preceding image, the TPI jumped. However, it was only a pilot, and by the next measurement, the TPI dropped again. The evidence was there though, and the team now got the resources to work on a permanent fix. The initial implementation was in place for the July 2014 measurement, where we see a significant improvement. More refinement was done, and by December 2014, the TPI was showing a major turnaround.
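To make a score like this concrete, here is one way a task-level indicator could be computed from observed test sessions. The published TPI methodology combines task success with time against a target; the exact scoring bands below (full credit for success within the target time, half credit for slow success, zero for failure) are illustrative assumptions, not Cisco's or the author's published formula:

```python
def tpi_score(attempts, target_seconds):
    """Score one task from a list of observed attempts.

    Each attempt is a (succeeded, seconds) pair. Scoring bands are
    illustrative assumptions: success within the target time scores 100,
    slower success scores 50, failure scores 0. The task score is the
    mean across attempts.
    """
    scores = []
    for succeeded, seconds in attempts:
        if not succeeded:
            scores.append(0)
        elif seconds <= target_seconds:
            scores.append(100)
        else:
            scores.append(50)
    return sum(scores) / len(scores)

# Ten hypothetical attempts at a task with a 60-second target time.
attempts = [(True, 40), (True, 55), (True, 90), (False, 120),
            (True, 30), (False, 200), (True, 62), (True, 45),
            (True, 58), (False, 75)]
print(tpi_score(attempts, target_seconds=60))  # 60.0
```

Because the same task instruction and target are reused at every measurement round, the score is comparable across rounds, which is what makes the before/after charts in this chapter meaningful.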

The Task Performance Indicator puts you in control of your own destiny, unlike other traditional customer satisfaction metrics. “We were driving the site experience by measuring ‘customer satisfaction’ with a monthly survey,” Bill Skeet explains. “This metric was a lagging indicator at best. It was very difficult to see the effect of our changes in that metric as customer ‘satisfaction’ is a reflection of a conglomeration of experiences.”

Being able to impact such a general metric as “customer satisfaction” has long been a challenge for organizations. If there’s a price increase, for example, then customers may well be dissatisfied even though they’ve had a perfectly good support experience. If you ask about satisfaction on a Monday, you can get worse scores than if you ask on a Friday. Customer satisfaction metrics have also been found to be a poor indicator of customer behavior.

Evidence-based decision making is something that very much suits the Cisco culture. “Consider the alternatives to evidence-based decision making and it is hard not to be a believer,” Bill states. “If you aren’t using evidence to make decisions, then decisions are subjective (rather than objective) and imposed by fiat. The evidence-based approach promises to be more scientific and therefore predictable and reliable. One advantage is that it allows for rapid course correction. This allows teams to move faster and make occasional mistakes without long-term consequences. Unfortunately, many organizations have a history of executing monolithic projects in a ‘launch and leave’ fashion.”

Simplifying guest account creation

When the following task was initially measured, the results were not good:

Create a new guest account to access the Cisco.com website and log in with this new account.

Create a new account task performance: 2014–2015

In fact, during the March 2014 measurements, nobody succeeded in completing the task. After the March measurements, three specific design improvements were made. These involved:

1. Clearly labelling mandatory fields

2. Improving password guidance

3. Eliminating address mismatch errors

In addition, a shorter pilot form was launched as a test. In the July 2014 measurements, success jumped by 50 percentage points! However, by the December 2014 measurements, the pilot form was no longer there, and success dropped by 21 points. By the June 2015 measurement, the shorter, simpler form was fully implemented, and success had again reached 50%.

The team was able to show that:

· The three design improvements raised the success rate by 29 percentage points

· The shorter form raised the success rate by 21 percentage points

That’s very powerful. You can isolate a piece of work that you have done and link it to a specific increase in the TPI. You can start predicting that if you invest X, you will get a Y-point TPI increase. This is control, and the route to power and respect within the organization.
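The attribution arithmetic above can be restated in a few lines. The figures are the ones reported in this chapter, read as percentage-point changes from the March 2014 baseline of 0%:

```python
baseline = 0       # March 2014: nobody completed the task
design_fixes = 29  # points attributed to the three design improvements
shorter_form = 21  # points attributed to the shorter pilot form

# July 2014: both the design fixes and the pilot form were live.
with_pilot = baseline + design_fixes + shorter_form

# December 2014: the pilot form was gone, only the design fixes remained.
without_pilot = with_pilot - shorter_form

print(with_pilot, without_pilot)  # 50 29
```

The pilot's removal acts as a natural experiment: the December drop isolates the shorter form's contribution, and the remainder is attributable to the three design fixes.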

And if you can link it with other Key Performance Indicators, then that’s even more powerful. The following table shows that support requests connected with guest account registration more than halved as a result of the improvements made to the registration form.

Create a new guest account support requests: 2014–2015

A more simplified guest registration process resulted in:

· 80% productivity improvement

· Registration time down from 3 minutes 25 seconds to 2 minutes

· Three fewer people required to support customer registration
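The registration-time figure is worth a line of arithmetic: 3 minutes 25 seconds is 205 seconds, so cutting it to 2 minutes saves 85 seconds per registration, roughly a 41% reduction:

```python
# Convert the before/after registration times to seconds and
# compute the saving per registration.
before = 3 * 60 + 25  # 3:25 -> 205 seconds
after = 2 * 60        # 2:00 -> 120 seconds

saved = before - after
print(saved, round(saved / before * 100))  # 85 41
```

Multiplied across every new guest account, that per-registration saving is where the productivity improvement reported above comes from.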

At every step of the way, the Task Performance Indicator gives you evidence which can be used against the voluminous opinion which tends to exist within large organizations. For Jeanne Quinn, senior manager responsible for the Cisco Partner website, “It’s really helped us fight against some of the ‘bright shiny object’ disease and the tendency for everyone to have an opinion of what we put on our webpages — and where/how — because we have data to back it up. Our customers and partners do this — and not that — and when we organize content this way, they struggle, and when we organize it that way, they succeed! Clear and simple. Not easy to do mind you, but a fact-based approach that wins over colleagues and executives alike every time, and clears the way for us to make significant changes, and prove whether or not they are working for the most important folks — our partners.”

Never-ending continuous improvement

When we measured the ability of customers to change their passwords, we found that 37% of them were failing. A process of improvement was undertaken, as can be seen in the following chart, and by December 2013, we had a 100% success rate.

Changing password task performance: 2012–2013

A 100% success rate is a fantastic result. Job done, right? Wrong. In digital, the job is never done. You must keep measuring the top tasks because the digital environment they exist within is constantly changing. Stuff gets added, stuff gets removed, and stuff just breaks.

Changing passwords task performance: 2014

When we measured again in March 2014, the success rate had dropped to 59% because of a technical glitch. The glitch was quickly fixed and in July, it was back up to 100%. It never ends. That’s digital transformation: from projects to processes, from organizational inputs to customer outcomes. A relentless focus on improving the customer experience by making it easier and faster for them to complete their top tasks.

Read the previous chapter: Continuously Improving Top Tasks

Read the next chapter: Switch

Buy print version of Transform for $12

Buy Transform ebook for $6
