Photo by Helloquence on Unsplash

Testing Fast And Cheap: A Quick Guide To UX Research. Part 2.

Pre-design stage: Idea Hunting

Galina Kalugina

--

A journey of a thousand miles begins with a single step. The first step for any design task is pre-project research. At this point we're not looking to verify any solutions; we want to explore as many options as possible, as fast as we can. Below I suggest a few ways of doing so, and they also work well in combination.

Competitor analysis

Competitor analysis is a marketing term for assessing the strengths and weaknesses of existing and potential competitors. In our case, it also includes spotting patterns and defining best practices for a particular type of software. Learning about existing solutions in the specific field helps you grasp the nuts and bolts of designing this kind of product and avoid reinventing the wheel.

When it works best

Though it's not a research method per se, this exercise is the one thing that is mandatory for every project. I recommend doing it before the kick-off meeting, so you'll be able to ask vital questions right away (and look savvy).
Tip: don't hesitate to look into indirect competitors as well. For example, if you're designing a cryptocurrency trading platform, you may want to check out traditional currency or stock trading platforms. If you're creating a mobile app for event planning, make sure you've explored the corresponding features of social networks, e.g., Facebook.

Which tools to use

Aside from screenshots of your competitors' products, you may also use some publicly available stats for reference.

How to conduct

This little study can be very beneficial if done with particular goals in mind. Here’s my list:

  • Learn the subject area
    When you absorb a great deal of material in a short time, you become able to reverse-engineer the requirements and understand the subject area a little. It won't equip you with universal knowledge, but it's a good start.
  • Define common patterns and features
    Familiar patterns don't take much time to get used to, so they feel intuitive. Putting together a little library will help you find optimal solutions quickly later on.
  • Note the insights
    Write down good ideas. Explain why you think they work better than the alternatives. Log your thoughts, but don't get overly attached to them: what looks great on paper may be technically or legally impossible to implement.
  • Mark questionable solutions
    Maybe those little flaws are an opportunity for your company or clients to surpass the competition by providing handier tools. Or perhaps they are red flags for hidden complications in the business process, caused by technical or legal constraints. Investigate.

How to deliver results

Since it's not the most sophisticated research, the deliverables don't need to be extensive either. I suggest putting together a mini-deck with your notes, illustrated by the most demonstrative examples. Don't overdo it, though: one screenshot per point is enough.

When it fails

This method is pretty simple, which makes it practically bulletproof. The only way to fail is to skip it, though not documenting your research wouldn't be the best idea either. Even if all your competitors' apps or sites look alike and all the solutions seem obvious to you, make sure you note that for future reference. In writing. With screenshots.

Tip: even if you believe this part goes without saying, don't hesitate to add a report on it to the deck. Whenever I design anything at all, I start with an exploration of everything even distantly related to the topic. But I never considered presenting my findings to stakeholders until now; it felt very "backstage" to me. And guess what? I got rejected twice (once after the onsite interview for my dream job at the time) specifically for the "lack of research" in my take-home design challenge. Ironically, my deck contained the results of a fairly thorough survey into which I'd put a lot of effort. It just didn't match expectations. To sum up: one slide dedicated to competitors won't take too much time to present or review. Nevertheless, it will fully cover you on the research side during a job interview or the first presentation to your clients.

User Interview and Field Observation

The names of both methods are pretty self-explanatory. I've combined them into one section because they serve the same purpose and require a similar approach.

When it works best

Observing users in their everyday lives or directly asking them about their challenges works well in many cases, from designing a fitness app from scratch to improving the usability of ticket kiosks. But these methods are especially handy when you can't gather information any other way, for example, when you have to design a complicated system with a narrow target audience.

Example: you are hired to create a new CRM system for a major car dealership. Their goal is to cut processing time per deal by up to 50%. The client's IT project managers have a good idea of what they are going to build. The problem is, they neither work with the current CRM nor have experience selling cars. Managers might see a car purchase as a pretty straightforward process backed by pro-and-con lists. But in real life, emotions are often involved in big decisions, and buyers may go back and forth several times before they finally make up their minds. Discussing the actual process with the people who will use the product you design will help you truly understand their needs.

What to expect

The primary goal of this research is to make sense of a business process or an everyday interaction. Another is to figure out what kinds of issues potential users face now and how they resolve them.

Which tools to use

  • Recording device for the interview (your phone will most likely do)
  • Notepad and mobile camera for field observation

How to conduct

When it comes to direct communication with people, preparation is everything. Remember, people are taking time out of their busy schedules to help you do your job as well as you can, so be appreciative. Don't leave anything to chance. Here's a checklist to help you get ready.

  • Interview questions
    Define your objective and write down a list of questions that will help you steer the conversation where you need it to go. Make sure they are transparent and neutral. Learn them by heart to avoid spending too much time looking through your notes. Aim for 20 questions or fewer: a long conversation is likely to exhaust your respondents.
  • Recording device or a notepad
    Make sure you've charged your phone and have a fully charged power bank when heading to an interview. Even for an experienced interviewer, the process can be stressful, so don't add to the pressure. If you are going to observe some activities, take a notepad and a camera with you. Remember to ask permission before taking pictures of people. If you are going to watch a process in a working environment, consider asking permission to shoot video in advance to spare yourself the distraction on-site.
  • Attitude and attire
    Always be nice to people and do your best to blend in: if your interviewees follow a dress code, follow it as well. When observing how other people work, don't interfere; be invisible at all times (lunch excluded).
  • Treats
    Be thankful; express your appreciation with a little present or some other kind of reward.
    Tip: if you aim to study a complicated process, start the interview with general questions about the user's daily routine. For example, "What do you usually do when you come to work?"

How to deliver results

The material you gather during interviews and observations may be abundant yet very valuable. Create a comprehensive summary, because you are unlikely ever to revisit your notes in full.

When it fails

It's easy to ask leading questions, omit observations that don't support the initial hypothesis, or jump to conclusions too soon. In any case, the remedies are thorough documentation and strategic pausing. Record everything you see and everything users say rather than looking for solutions as you go. Give yourself some time to cool down: wait until the immediate impressions fade away (and you forget half of the ideas you came up with during the experiment).

Survey

The recipe is simple: you ask a sampled audience questions to learn about their traits, tools, and life hacks.

When it works best

Surveys are helpful if you want to learn about the behavioral patterns of your target audience at large.
Example: you implement a powerful feature based on a loop interaction, which occurs only under a particular condition. Your company announces it via social media and gathers a lot of positive feedback from users. A few months later, clickstream analytics show that almost nobody uses the feature, even though it is supposed to assist them in lots of unpleasant everyday situations. Since it's a loop interaction, you can't observe it in a lab. Clickstream data can only reveal the problem, not explain why it occurred. In this case, surveying users may be your best chance to figure out what's going on.

What to expect

The goal is to spot trends or to learn about the habits of your current or potential users. Though the preparation process is somewhat similar to the one you go through before a face-to-face interview, the outcome is different. For this kind of research, you sample a significant number of respondents and aim for a broadly universal big picture rather than the details of one particular process.

Which tools to use

  • Google Forms
  • SurveyPlanet
  • SurveyMonkey: not free, but very powerful. You probably don't need to buy a subscription just for design needs, though it won't hurt to ask the marketing department whether they already use it and would assist you with your research.

How to conduct

  • Limit your ask
    You may want to ask users about every aspect of their behavior (I know I would), but your goal is to get them to finish the questionnaire. That is far more likely when you ask only 20–25 questions rather than 100.
  • Bias-proof your survey
    When preparing the study, make sure you don't ask direct questions. Spread questions randomly rather than grouping them by theme. In the most sensitive cases, secure the integrity of answers by asking the same things in different wording, mixed in with unrelated questions; that will help you spot contradictions in the replies. It's essential that users can't figure out your topic of interest and tailor their answers to it. Also, avoid reading answers before their quantity is representative: doing so, you risk coming to premature conclusions and unconsciously ignoring new data that doesn't back your initial idea.
  • Employ your connections
    If you don't have the budget to hire an agency (given you're reading this article, I assume you don't), turn to your social network. If you have a specific target audience, like frequently flying business travelers or adoptive parents, state it clearly in the post. Don't forget to write a little thank-you note once you've got enough responses. I believe 100+ responses are representative enough to get your task off the ground, though I wouldn't build the whole product strategy on this data.
  • Protect your sources
    Always enable anonymous answers and depersonalize data before presenting your report. People are more likely to participate and share your survey if they trust you. Be credible.
  • Test your experiment
    Before letting your questionnaire go wide, ask your colleagues to fill it in, then process the answers to see whether the outcome makes sense to you. Ask for their feedback and fix all the issues they point out. Don't forget to exclude their answers from the final set.

How to deliver results

  • Start your presentation with the conclusions you came to.
    Which traits dominate amongst your respondents? Do they have any common issues you can fix right away?
  • Show the raw data (preferably in diagrams).
  • Explain your method.
    How many responses did you get? Whom did you ask? How did you gather responses?
  • Conclude with a proposition for the next steps.
    What kind of features should the team focus on first, in your opinion? How can you fix the issues that you have found?

When it fails

  • Questions are vague or ambiguous
    Even within the same culture, people may interpret the same concepts slightly differently. Some "obvious" ideas may be one's long-held personal defaults rather than common knowledge. To avoid misinterpretation, check the wording of your questions and make sure it's as straightforward as it can be. Test-run your survey before distributing it.
  • Too many questions or answer choices
    People are unlikely to finish the study if it's going to take more than 3–5 minutes of their lives; there are always more important things to take care of. Respect respondents' time. As for answer options, people struggle to make a decision when too many choices are available, so they are likely not to make one at all. Try to limit the options to 4–6, and add "Other" to the list so you don't miss anything you haven't thought of.
  • Too many open-ended questions
    They require typing and deeper thinking, which makes the process time-consuming. If people believe your questionnaire will take up too much of their time, they are likely to postpone it until they have more, and then instantly forget all about it. Aim for 3 open-ended questions or fewer, and never make them mandatory.
  • Choices are irrelevant
    That may happen when your mental model is very different from your respondents'. If they don't see an option that reflects their position, they'll likely pick a random one. Test-run.
  • Presentation to the team is inconclusive
    If you don't suggest next steps after presenting your findings, your stakeholders will most likely forget about your report right after lunch. The goal of every little experiment you run is to advance your product; collecting data is one means to that end, not the objective itself. Propose next steps and ask your team to contribute during a Q&A session after the presentation. Write down their ideas and send the meeting notes alongside the initial deck after the meeting.

That’s it on preliminary research. Best of luck with your creative process!

Previously on Testing fast and cheap: “Introduction: defining goals, picking the right method, and planning the experiment.”
Up next: "Design iterations: the proof of concept."
