If time travel were possible, what would I tell myself as a junior UX researcher?
Lessons learned from a seasoned UX Researcher
I don’t like time travel movies. Although I passed my college theoretical physics class, I don’t buy the notion that disrupting the past will affect the future. Can Marty in Back to the Future really prevent his own existence through his interactions with, and advice to, his father?
Still, it is curious and makes me think. If I could go back in time and talk to myself as a rookie UX Researcher, what would I tell myself? I’ve worked on a variety of user experiences over the years. I’ve built products that assisted small businesses in creating online stores and those that helped the elderly get online for the first time. I’ve helped locals navigate their neighborhoods, and, today, I help students navigate their way through college. However, with that variety comes a variety of mistakes and failures.
After some careful thought, here are 3 things I did wrong and 2 things I thought I did wrong but actually did right:
3 things I did wrong
I included EVERYTHING
One of the most in-depth ethnographies I have ever conducted involved understanding how people discovered events and things to do in their neighborhood. I had people track their social events and entertainment activities over a 4-week period and visited homes across the state to conduct interviews. After flying home from the field, I met with my manager and dove into all the exciting insights I uncovered. After about 30 minutes of me talking non-stop, she responded, “That’s great Mark! Now what are you going to tell your team? Because you don’t have time to tell them all of that.”
As a new researcher, everything was exciting to me. Curiosity was one of the reasons I became a researcher. When I had a mountain of data, I thought the next step was to simply organize and share this mountain. Prioritizing was simply a matter of what to show first. I even had an “Other” slide at the end where I dumped miscellaneous insights which just didn’t fit elsewhere.
The problem is, when everything is important, nothing is important. Determining your ‘key takeaways’ can be a challenge for any presentation, and it is especially challenging for a researcher full of curiosity.
These days I just drop data from the report. Prioritization is a matter of ruthlessly cutting insight ‘fat’ and crafting a leaner message. For a tactical project (i.e., usability/lab research), this requires looking back at the key questions and hypotheses and addressing those issues first and foremost. What are the key decisions? What are the key points of friction? If it is a usability issue, what is the level of severity? What is going to lose customers vs. what is merely an annoyance?
Exploratory research can be more challenging because the whole goal is to explore and learn what we don’t know. In these studies, I listen to my stakeholders to understand what stands out as exciting from their perspective. I try to distill the data into key principles or themes and let the smaller insights come out in the user stories I tell around those themes.
Don’t get me wrong, I still analyze all the data and follow my curiosity. I just don’t always communicate it, and I keep those ‘scraps’ of data in my back pocket in case those insights become relevant later.
I underestimated the longevity of research
I once spent 6 months tracking a group of students as they looked for internships. It was a fascinating topic and, at times, emotional as I listened to their struggle to find their career path. About midway through the project, the company’s focus shifted away from internships.
Now, had it been earlier in my career, I would have ditched the project. It’s not relevant anymore, better to move on, right? However, being more seasoned at this point, I decided to stay the course and see the project through. This wasn’t my first rodeo, and I had seen projects pivot and shift in the past. I understood an important point about research shelf life: even if the learnings may not be relevant now, they could be relevant later, so look beyond the immediate needs of the team. About 16 months later, internships were back in focus, with the team asking questions we had already answered, and I was able to showcase these insights.
Things can change very rapidly in our industry. Concepts and questions may arise, but for many reasons (unclear business model, undeveloped technology, conflicting company strategy, lack of the right personnel), the project is just not ready at that point in time. However, the questions still come from a genuine interest, relevance, and need to understand.
I once took over a team from a previous researcher who had conducted fascinating projects and collected a wealth of data. My team had questions around a topic that researcher had explored a year earlier. I dove into the report; it was short and direct (X did not work, do Y). Really effective, and written for the team at the time. However, it was of no help to our team a year later. The context was not explained, nor were the core needs illustrated. Why didn’t X work? In short, it had no lasting impact.
These days I try to think beyond the team of today and consider what someone one year, or even five years, from now could utilize from this research. Thinking in this manner also helps me analyze the research more broadly, uncovering base user needs and behaviors rather than just answering the question at hand.
I relied on one moment for communication
Early in my career, I thought the process after conducting the research was simple: Create a report, then send/present the report. I’m then done with that project and I move on. What I learned over the years was: 1) not everyone reads reports, 2) not everyone comes to a presentation, 3) not everyone, including myself, always remembers or applies these insights moving forward.
Even when I tried other methods, such as quick immediate debriefs, affinity mapping, or interpretation sessions, it always came down to relying on the team’s participation at that moment.
Communicating insights is an ongoing process. It is not one report, one presentation, or one workshop. Sure, I will do my best to get all the key stakeholders in the same room for a discussion. I know that is an important moment and I will strive to plan around it. However, I know now that it isn’t necessarily the only moment.
Now I use many methods of sharing for a wide range of audiences. Communication for me takes many different shapes and sizes: small debrief sessions, larger presentations outside my team, hallway conversations, synthesizing my insights into other people’s work. I’ve even given presentations to just one person. It’s awkward, but it has also led to some of the most interesting conversations.
I share, share, and share some more. This includes revisiting data that my team should have already internalized months ago. I don’t expect my team members to always keep the insights top of mind. The context of the product may have changed such that referencing the insights again may lead the team to reach a new solution.
The illusion of “the one moment” is even stronger when you are sharing insights with the “big dogs.” Their titles may vary per organization, but you know who these people are. Their names are usually preceded by the phrase “we need to get this in front of….” I used to think that if I could show and convince “so-and-so,” I’d be set. While communicating to these people is critical, it’s not the only moment. I’ve seen inspiration and change come from all levels, sometimes from people and places I least expected, much lower on the org chart.
Communicating insights in the past used to feel like checking off boxes, especially when I’ve been in environments where communication standards were in place. Communicating insights today feels more like being an entertainer on a promotional tour trying to get the word out to as many people as possible.
2 things I thought I did wrong but were actually right
I didn’t fake it
I never heard the phrase “fake it till you make it” until I was well down my career path. And that’s probably just as well, because I think it is horrible advice for a researcher.
As a young researcher, I was always brutally honest. I made it clear to everyone that I was junior, had a lot to learn, and I did not have all the answers. One of the first projects I ever worked on was an extremely complex B2B application which required me to go out and talk to some brilliant people on very technical topics. As a junior researcher, I felt totally out of my element. I said flat-out to the senior researcher mentoring me at the time, “I don’t think I’m qualified to do this. I have no idea how these things work.” It was embarrassing, but it was how I felt. My mentor responded: “That’s perfect, that means you won’t be biased.” This is exactly what I needed to hear as it opened the discussion for what I didn’t know and what I needed to know.
Saying you don’t know is important for a researcher. It’s a powerful phrase and tool for research. By saying “I don’t know,” you:
- Help establish the boundaries of the research. No research project or method is perfect. Each one is designed to answer specific questions. Be clear in saying what questions your research answers and doesn’t answer.
- Open the discussion to what we need to know. Sometimes research comes away with as many questions as it does answers. Saying “I don’t know” adds curiosity. We don’t know so let’s go find out.
- Give more credibility and importance to what you do know. As a researcher, your influence is built on establishing your credibility. If you’re just guessing at a question or insight, how are your stakeholders supposed to believe you on the things you really do know?
Don’t fake it. Don’t say you know, when you don’t. Take this as the opportunity to say, “I don’t know. I’ve never done this. Let’s go learn.”
I valued the power of “why?”
In many of my first research projects I barely ever had enough time to cover all the questions and topics. In one project I remember apologizing to my manager because I was cutting it so close and running over the session time. She responded: “It’s ok. You’re just being thorough.”
I was thorough. Extremely thorough. A lot of it had to do partly with my curiosity and partly with my naïveté and eagerness to get the insight I was uncovering right. I was always asking why, hoping to be absolutely certain about what we were learning. “Why did you do this?” “Why do you feel this way?” “Can you clarify what you mean by this?”
But at the time, I didn’t necessarily think this was a good thing. At the time I thought I was slowing everything down, wasting everyone’s time probing and clarifying. I thought I needed to cut to the chase and produce faster.
I know now that what I was doing was exactly right. Asking ‘why’ is one of the most important questions for UX research and especially qualitative research. It is not enough to say users did this or said that. Explain why they exhibited that behavior or sentiment. And more importantly, why it matters.
As I have gained experience and become more certain and confident, I have noticed a tendency to take shortcuts. Maybe I’ve seen a behavior before, or I know a product landscape inside out, making it easier to jump to conclusions. However, I should listen to my past self: stay thorough and keep asking why.
Hindsight is always 20/20
I haven’t had a chance to get my hands on a time machine yet, and I don’t think I want to. Reflecting on these mistakes now, I don’t think I could have become as effective a researcher today without learning from these mistakes. If anything, I’d tell myself to relax, embrace failure, move on and reflect. Who knows, you may be doing right by yourself in the long run.
For the time being, I think I’ll avoid pushing that DeLorean to 88 miles an hour. What about you? If you could go back and have a conversation with your junior self, what advice would you give?