Although it is natural to assume that customer research is just about talking with customers, real innovation tends to be in blind spots where customers aren’t even aware that they have a problem.
Nick Bowmast described a study from years ago in which an upstart telephone directory company did site visits to discover the unique problem they could solve to differentiate themselves from the competition. Back then, we all got great big books called “Yellow Pages” that carried advertising from all of the local businesses in the area. I remember sitting on one as a booster chair for years. A team from the new company went out to observe how customers used the competitive product. They noticed that the book’s large size affected where it was stored, and what was stacked on top of it. Observing this led them to the ah-ha that to be the first book someone grabbed, they needed a smaller form factor. They shrank the size and immediately took over the market. No one had been complaining about the size. But their behavior suggested that it was a problem.
It isn’t that you don’t want to talk with customers or listen to their suggestions, and it isn’t that you want to ignore data. But when you want to create something new to the world, you are working in a space that goes beyond the experience of a typical customer. One reason to send your technologists out to observe customers is that they may see new opportunities that people who only listen to customers will miss.
I once went on a site visit to a small-business owner who ran her business from a PC in her living room. After a chat about her business, we settled in and I asked her to show us how she did her work. She talked through the experience of getting an order, fulfilling it, tracking it, invoicing, collecting payments, and so on. Along the way, she pulled up Microsoft Excel and talked about how she created reports in it. We watched her enter information into the rows and columns, then take a calculator that sat next to her computer, manually add up the values in a column, and type the total into the TOTAL line under the column. She was using Excel, as she freely admitted, but she wasn’t using it as a real spreadsheet; she was using it as a layout tool for designing her reports. She didn’t know about the SUM function, and she didn’t know she needed one.
A team I worked with years ago had a payroll solution that frustrated customers. It was painful to watch customers attempt to use it: they lost data, had to re-enter information, and ran into a host of other usability problems. Of course, we wanted to fix it immediately. However, when we spoke with one customer about the experience, she claimed that it was easy, and gave the product 5 out of 5 on ratings including Easy to Use. We KNEW that wasn’t true. We had her on camera swearing at the computer when it lost her data. WHY? Well, she blamed herself.
You might think that these things don’t happen today; all of these stories are from long ago. But you’d be wrong. I just heard a group at a famous, successful tech company describe the metric they were using for a go/no-go launch decision: a rating from the customer that the product was “easy to use”. No observation. No behavioral data. Just what the customer said.
A couple of years ago, Intuit shifted the way it measures success. Of course, they have all the traditional metrics: customer acquisition, revenues, etc. But now the businesses are required to measure improvement in the benefit that they offer customers. When we rolled out this change, while I was still at Intuit, it was very difficult for people to find metrics that meaningfully captured the benefits of some of the products. In some cases, the teams didn’t actually know what the customer benefit was. In diving deeper, we realized that although we had a culture that encouraged teams to do site visits with customers, we hadn’t been focusing on determining what was important to customers — WHY they chose to use our products. And since we weren’t focused on that, we weren’t measuring it. Once we did start defining and tracking customer benefits, we could see whether our “innovations” and improvements were actually making an impact on what mattered most to our customers.
The same can be said of “internal customers” in employee-facing functions. All too often, we rely on employee surveys and the opinions of people leaders, without actually going out and observing to understand what the real challenges are and why employees are (or are not) adopting our solutions.
You might think that employees use the tools we provide them because they are compelled to. Finance, HR and IT, for example, invest a great deal of money and time implementing tools to be used by an entire company. Employees are told “This is the tool we use to report expenses” and “This is the tool we use for performance management,” and they are expected to use them. There is often a huge “training” effort to get people “onboarded” into a tool, which is another big investment for those groups. Yet problems inevitably occur when employees don’t use the tools, or don’t use them “properly”.
Over the past twenty years, I’ve worked with many internal teams to do observational customer research with their employees. Each time, the teams are shocked to see how people are (or aren’t) using the tools, and the crazy work-arounds they have devised to make the tools meet their needs. Each time, the team reaches a deeper understanding of what is REALLY going on… information that is not reflected in survey data. Each time, new unstated requirements and contextual details lead the team to open up to new ideas for solving the stated purpose of the tool.
Examples are helpful. Let me share three true stories.
- The failure of a performance management tool. Once upon a time, a company was trying to gain more consistency in its performance evaluations, in order to better calibrate and to get better reporting. They found an enterprise solution on the market and decided to implement it. The solution required a huge investment in infrastructure, and the team spent a lot of time (using a waterfall development methodology) collecting requirements and setting up the system with the skills, capabilities, levels and details from the company. They set up accounts for every employee at every level, with a built-in reporting structure, deadlines and required fields… they ran a massive training effort with leaders, managers and employees so they’d know how to use the tool. The effort took months (or was it over a year?) and was a huge investment. They rolled it out to the company with instructions to use the tool for annual reviews. People struggled. People revolted. Within days, a senior leader got up in front of the entire company and told everyone that they could use whatever tool they wanted, including PowerPoint. The important thing was the conversation with their managers. The ideal is to have regular development and evaluation discussions with your manager. A tool is just a tool. If it doesn’t work for you, don’t use it. By the end of that week, the performance management system was abandoned. ALL of that time, ALL of that money, ALL of it… for nothing. Why? While the team was focused on the needs of the organization, they did not focus on understanding the needs of the employees. They did not observe employees to understand what they were currently doing and why. They did not strive to understand what was in it for the employees. It wasn’t that they were inexperienced; it was that they never thought about the employee — and WHY they would (or wouldn’t) adopt a performance management solution.
- The success of a Finance Reporting Tool. Once upon a time, a team of finance operations professionals was formed from within a larger finance function at a mid-sized corporation. They were tasked with addressing an issue that had bubbled to the top of an employee survey of their business partners. Surveys consistently showed that the business partners were dissatisfied with the finance tools they were using. People complained that the tools weren’t helpful, were hard to use, and didn’t provide answers to their questions. The team knew they had a problem, but they weren’t sure what the problem really was. So they divided into pairs and went to sit with their business partners, to watch them use the tools and to ask them about their experiences along the way. Each pair observed and wrote down the problems they saw and heard, then they came together to share what they’d learned. Their first ah-ha was that it wasn’t just one or two little problems; there were issues with everything the employees were trying to do. They homed in on the most painful and prevalent of these problems: the headcount issue. Multiple tools reported on the open/available headcount in an organization, but because of the way each tool calculated headcount, the numbers differed depending on where you looked. So if a leader wanted to know how many openings they had in their organization, they would have to take their best guess. This problem hit every organization in the company and was extremely painful to those who relied on the information as part of their job. So the team prototyped and experimented with a variety of solutions and landed on a simple tool that pulled in data from all of the other systems and calculated a single consistent value. They built their prototype in Excel, and the ultimate solution was really just a step up from that.
It did take them a few weeks to explore the problem and various solutions, but in the end they completely eliminated the problem for their business partners. It was a huge win, based simply on understanding the customer’s underlying need and delivering that benefit.
- The First Day Experience. Once upon a time, an HR team in a large tech company decided to address some rumblings from new employees about how their first day wasn’t quite what they’d expected. To understand the experience of being a new employee, they decided to shadow new employees on their first day of work. They started with the moment a new employee drove onto campus and followed them the entire day, until they drove off. They did this with only 10 employees over one day and were horrified at what they saw: people lost, climbing under their desks to set up their monitors, eating lunch alone, and spending a huge amount of time trying to get their badges, tools and access to their computers. Overall, it seemed like a confusing and lonely first day, which didn’t jibe with the vibe of the company or its values. The team brainstormed and tested ideas and ultimately designed an end-to-end experience for employees’ first days that celebrated their arrival and welcomed them into the company. The final experience included everything from a dedicated parking space, to a campus-wide tour, to a pre-set-up badge and computer, to a group lunch, to a handwritten card and “swag” from their managers, plus a cascade of interactions, training and activities to be rolled out from corporate down to their own team over the next two months. The rumblings stopped and new-employee satisfaction went up. All because the team took a day to understand the employees’ experience.
Now, in each of these three stories, the team had been alerted that there was a problem. But remember: real innovation tends to be in blind spots where customers aren’t even aware that they have a problem. So the key to creating a great EMPLOYEE experience is to apply that same curiosity and observation to every experience you touch. When was the last time you did a “site visit” with your employees?