Back in 2013, Erika Hall, founder of Mule Design, wrote “Just Enough Research”. The book’s nine chapters distill her experience into a brief cookbook of research methods. Although it is slim (only 163 pages), you’ll find the reasons for research, the basics, the research process, and the research methods that can help you become a good researcher.
The list of chapters in this book:
- CHAPTER 1 — Enough is Enough
- CHAPTER 2 — The Basics
- CHAPTER 3 — The Process
- CHAPTER 4 — Organizational Research
- CHAPTER 5 — User Research
- CHAPTER 6 — Competitive Research
- CHAPTER 7 — Evaluative Research
- CHAPTER 8 — Analysis and Models
- CHAPTER 9 — Quantitative Research
And now, here is a chapter-by-chapter summary of “Just Enough Research”:
CHAPTER 1 — Enough is Enough
What research is
- Research is simply systematic inquiry.
- Personal research — Finding information for yourself (Google searches).
- Pure research — Is carried out to create new human knowledge based on observation or experimentation.
- Applied research — Borrows ideas and techniques from pure research to serve a specific real-world goal.
- Design research — Studying design itself or the end users.
What research is not
- Not asking people what they like (or hate).
- Not a political tool.
- Applied research is not science.
CHAPTER 2 — The Basics
Who should do research? Everyone!
Find your purpose
- Generative or exploratory research: “What’s up with…?”
- Descriptive or explanatory: “What and how?”
- Evaluative research: “Are we getting close?”
- Causal research: “Why is this happening?”
The research processes
1. The objections you will hear
- We don’t have time.
- We don’t have the expertise or budget.
- The CEO decides what we do anyway.
- One methodology is better (quant or qual).
- You need to be a scientist.
- You need infrastructure (equipment and a special room).
- It will take too long.
- You can find out everything you need once we launch the beta.
- We know everything already.
- Research will change the scope.
- Research will impede innovation.
2. Actual reasons behind the objections
- I don’t want to bother.
- I’m afraid of being wrong.
- I’m uncomfortable talking to people.
Research in any situation
“Poor user experiences inevitably come from poorly informed design teams.” — Jared M. Spool, founder of User Interface Engineering
Contexts and situations where you might be doing research:
- Client services agency.
- In-house at a big company.
- In-house at a startup.
- Working with an agile development team.
Just enough rigor
1. Cover your bias
- Design bias — The design of your study has inherent biases so be sure to compensate for them.
- Sampling bias — Be mindful of who you select to participate.
- Interviewer bias — It’s hard to be neutral.
- Sponsor bias — Who’s paying for everything?
- Social desirability bias — Everyone wants to look their best.
- The Hawthorne effect — The behavior of the people you are studying might change just because you are there.
2. The ethics of user research
- The project as a whole — Is it ethical?
- The goals or methods — Are you tricking your participants?
- Consent and transparency — Informed consent is the rule.
- Safety and privacy — Don’t do telephone interviews with someone who is driving.
3. Be skeptical: Ask a lot of questions! Get comfortable with knowing and working with your own limits.
4. Best practices
- Phrase questions clearly.
- Set realistic expectations.
- Be prepared.
- Allow sufficient time for analysis.
- Take dictation.
5. How much research is enough?
- Avoiding unnecessary research
- That satisfying click — There is no answer to the question of enough.
CHAPTER 3 — The Process
1. Define the problem — A useful research study depends on a clear problem statement. Base your statement on a verb that indicates an outcome, such as “describe,” “evaluate,” or “identify.”
2. Select the approach
3. Plan and prepare for the research
Identify the point person who will keep track of everything. Sketch out a plan in terms of time, money, people involved and their roles, necessary materials, etc. You don’t have to get it 100% correct but it helps to have it all laid out.
Recruiting — Recruiting is simply locating, attracting, screening, and acquiring research participants. There’s no draft, so you have to recruit.
A good research participant:
- Shares the concerns and goals of your target users.
- Embodies key characteristics of your target users, such as age or role.
- Can articulate their thoughts clearly.
- Is as familiar with the relevant technology as your target user.
Recruiting isn’t fun but it gets easier with practice.
- Use the web to your advantage to cast a wide net.
- Use a screener to make sure that you’re recruiting the right participants.
Elements of a screener:
- What are all of the specific behaviors you’re looking for?
- What level of tool knowledge and access do participants need?
- What level of knowledge about the topic (domain knowledge) do they need?
4. Collect the data — Find your research subjects. Conduct your interviews. Do your field observation. Run your usability tests.
- Materials and tools — Use what you already have first, and go for familiar tools.
- Interviewing — A simple interview remains the most effective way to get inside another person’s head and see the world as they do.
- Usability testing — The goal is to determine to what extent the product or service as designed is usable.
5. Analyze the data — Once you have collected the data, gather it all together and look for meaningful patterns. Turn the patterns into observations, and from those, recommendations will emerge.
- Get everyone involved.
- Structuring an analysis session.
6. Report the results
CHAPTER 4 — Organizational Research
Researching an organization is very similar to traditional user research and can be incredibly helpful to interactive design and development projects.
Who are stakeholders?
- Defined as “those groups without whose support the organization would cease to exist.”
- Executives, managers, subject matter experts, staff in various roles, investors and board members.
“Interviews with project stakeholders offer a rich source of insights into the collective mind of an organization. They can help you uncover areas of misalignment between a company’s documented strategy and the attitudes and day-to-day decision-making of stakeholders. They can also highlight issues that deserve special consideration due to their strategic importance to a business.” — Steve Baty, “Conducting Successful Interviews with Project Stakeholders”
What interviewing stakeholders is for:
- Neutralizing politics.
- Better requirements gathering.
- Understanding organizational priorities.
- Tailoring the design process.
- Getting buy-in from said stakeholders.
- Understanding how your work affects the organization.
- Understanding workflow.
Types of interviews:
- Individual interviews.
- Group interviews.
- Email interviews.
1. Interview structure
- Introduce yourself.
- Explain the purpose of the meeting.
- Explain how the interview data will be shared.
- Be sure that people can speak freely.
2. Dealing with hostile witnesses
- Do your research ahead of time to predict if a stakeholder might be combative.
- Remain calm and confident (practice with members of your team beforehand).
3. Documenting interviews.
What to do with stakeholder analysis
What you should include in this documentation:
- Problem statement and assumptions.
- Success metrics.
- Completion criteria.
- Risks, concerns, and contingency plans.
- Verbatim quotes — Very valuable but try to anonymize them.
- Workflow diagrams (see graphic).
CHAPTER 5 — User Research
When we talk about user research as distinguished from usability testing, we’re talking about ethnography, the study of humans in their culture. We want to learn about our target users as people existing in a cultural context. We want to understand how they behave and why.
Everything that factors into context
- Physical environment
- Mental model
Assumptions are insults
Getting good data from imperfect sources
What is ethnography?
The fundamental question of ethnography is, “What do people do and why do they do it?” In the case of user research, we tack on the rider “…and what are the implications for the success of what I am designing?”
The four Ds of design ethnography
- Deep dive — Get to know a small but sufficient number of representative users very well.
- Daily life — It’s of limited utility to learn how people behave in your conference room, so go to where they live and work.
- Data analysis — Systematic analysis is the difference between actual ethnography and just meeting interesting new people at a networking event.
- Drama! — Lively narratives help everyone on your team rally around and act on the same understanding of user behavior.
1. Interview structure: Three boxes, loosely joined
- Introduction: Say hello, express gratitude, talk about why you’re there, review demographic information.
- Body: Ask open-ended questions, follow up or probe as necessary, allow pauses and silences.
- Conclusion: Express gratitude again, ask if they have questions, talk about next steps.
2. Conducting the interview
- Don’t forget to breathe.
- Practice active listening (nod and say “mm-hmm,” but pay close attention).
- Keep an ear out for vague answers.
- Avoid talking about yourself.
Contextual inquiry is a deeper form of ethnographic interview and observation. It is particularly useful for developing accurate scenarios, the stories of how users might interact with potential features.
Things to keep in mind
- Travel — Allow plenty of time to get to the site and set up.
- Get situated — Find a comfortable spot that allows you to talk to the participant without interrupting their normal routine.
- Interview — Establish trust and learn about what you will be observing. Find out when it will be least disruptive to interrupt and ask questions.
- Observe — It’s a show. You’re watching. Note everything in as much detail as possible. The relevance will be apparent later. Pause to ask questions. Stay out of the way.
- Summarize — Conclude by summarizing what you learned and asking the participant to verify whether your observations were correct.
Focus group — Focus groups are the antithesis of ethnography.
CHAPTER 6 — Competitive Research
You need to know not only who your competitors are from the perspective of the business (that’s generally obvious) but who competes for attention in the minds of your target users.
SWOT analysis — Plotting out strengths, weaknesses, opportunities, and threats
Competitive audit — Once you have identified a set of competitors and a set of brand attributes, conduct an audit to see how you stack up.
Brand audit — Your brand is simply your reputation and those things that signify your identity and reputation to your current and potential customers.
Here are the questions you need to ask about your brand:
- Value proposition
- Customer perspective
Name — The name is the single most important aspect of a brand.
Logo — The logo is simply the illustrative manifestation of your brand, which can take several forms: wordmark, bug, app icon, favicon, etc.
Usability-testing the competition — Just what it sounds like. Take those usability testing skills and apply them to someone else’s product or service.
CHAPTER 7 — Evaluative Research
Evaluation is assessing the merit of your design. It’s the research you never stop doing. There are several ways to go about it, depending on where you are in the project.
Heuristic analysis — “Heuristic” in English simply means “based on experience”; a heuristic is a qualitative guideline, an accepted principle of usability. The method is very simple: evaluators (at least two or three, ideally) individually go through a site or application with a checklist of principles in hand and score the site for each one. The checklist is usually Nielsen’s ten heuristics.
Usability testing — Usability is the absolute minimum standard for anything designed to be used by humans. If a design thwarts the intended users who attempt the intended use, that design is a failure from the standpoint of user-centered design. Nielsen defines usability as a quality attribute with five components.
Do the cheap tests first and the expensive ones later:
- Start with paper prototypes and sketches
- Look at and test competitor’s products
- Test at every stage (as much as time will allow)
1. Preparing for usability testing
What you need:
- A plan (What are the tasks that you need to cover? Include the tasks as part of a larger scenario so the user can understand the context.)
- A prototype or sketch.
- Four to eight participants of each target user type based on personas (ideally) or marketing segments.
- A facilitator.
- An observer.
- One or more methods of documentation.
- A timer or watch.
Recruiting — Recruiting for usability testing is substantively the same as for ethnographic interviews.
Facilitating — A good facilitator is personable and patient.
Observing and documenting — Even if you are set up to record, it’s very important to have a second person observing the tests and taking notes.
Eye-tracking — Eye-tracking measures where someone is looking, how long, and in what direction.
2. Analyzing and presenting test data
The aim of usability testing is to identify specific significant problems in order to fix them. The outcome is essentially a ranked punch list with a rationale.
CHAPTER 8 — Analysis and Models
Analysis involves a few simple steps:
- Closely review the notes.
- Look for interesting behaviors, emotions, actions, and verbatim quotes.
- Write what you observed on a sticky note (coded to the source, the actual user, so you can trace it back).
- Group the notes on the whiteboard.
- Watch the patterns emerge.
- Rearrange the notes as you continue to assess the patterns.
Affinity diagram — Clusters of related observations. Each cluster then lets you extract insights and make recommendations.
- Write down observations.
- Create groups, noting all stated and implicit goals.
- Identify next steps.
Creating personas — A persona is a fictional user archetype — a composite model you create from the data you’ve gathered by talking to real people — that represents a group of needs and behaviors.
Mental Models — A mental model is an internal representation of something in the real world — the sum total of what a person believes about the situation or object at hand, how it functions, and how it’s organized.
Creating a mental model
- Do user research.
- Make an affinity diagram.
- Place affinity clusters in stacks representing the user’s cognitive space to create the model. These groups will include actions, beliefs, and feelings.
- Group the stacks around the tasks or goals they relate to.
Task Analysis/Workflow — Task analysis is simply breaking one particular task into the discrete steps required to accomplish it.
CHAPTER 9 — Quantitative Research
Qualitative research methods such as ethnography and usability testing can get you far, but you still won’t get everything right. Once your web site or application is live, then you have quantitative data to work with.
Optimizing a design is the chief aim of quantitative research and analysis.
Conversions — A user is said to convert any time they take a measurable action you’ve defined as a goal of the site. Measuring the conversion rate for each of these will indicate the success of that particular path, but not how each type of conversion matters to the success of the organization itself. That is a business decision.
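A conversion rate is simply converting visits divided by total visits, computed per goal. A minimal sketch; the goal names and counts below are invented for illustration, not from the book:

```python
def conversion_rate(conversions: int, visits: int) -> float:
    """Conversion rate = visits that converted / total visits."""
    return conversions / visits if visits else 0.0

# Hypothetical traffic and goals for one month.
visits = 20_000
goals = {"newsletter_signup": 1_200, "trial_start": 350, "purchase": 140}

for goal, converted in goals.items():
    print(f"{goal}: {conversion_rate(converted, visits):.1%}")
```

As the chapter notes, this tells you how each path performs, but weighing a newsletter signup against a purchase is a business decision, not a statistical one.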
Analytics — Analytics refers to the collection and analysis of data on the actual usage of a website or application to understand how people are using it. Over half of the world’s websites have Google Analytics installed.
Some of the basic stats to look at include:
- Total number of visits.
- Total number of pageviews.
- Average number of pages per visit.
- Bounce rate (the percentage of people who leave after viewing one page).
- Average time on site.
- Percentage of new visitors.
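All of these basic stats fall out of raw visit records. A hypothetical sketch with invented data, where each visit records pages viewed, seconds on site, and whether the visitor is new:

```python
# Each record: (visitor_id, pages_viewed, seconds_on_site, is_new_visitor).
# The data is made up purely to show how the metrics are derived.
visits = [
    ("a", 1, 15, True),
    ("b", 4, 240, False),
    ("c", 1, 30, True),
    ("d", 6, 410, False),
]

total_visits = len(visits)
total_pageviews = sum(pages for _, pages, _, _ in visits)
pages_per_visit = total_pageviews / total_visits
# Bounce = left after viewing exactly one page.
bounce_rate = sum(1 for _, pages, _, _ in visits if pages == 1) / total_visits
avg_time_on_site = sum(secs for _, _, secs, _ in visits) / total_visits
new_visitor_pct = sum(1 for *_, new in visits if new) / total_visits

print(f"visits={total_visits}, pageviews={total_pageviews}, "
      f"pages/visit={pages_per_visit:.1f}, bounce={bounce_rate:.0%}, "
      f"avg time={avg_time_on_site:.0f}s, new={new_visitor_pct:.0%}")
```

In practice an analytics tool computes these for you; the point is only that each stat is a simple aggregate over visits.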
Split Testing (a.k.a. A/B Testing) — This method is called split testing because you split your traffic programmatically and randomly serve different variations of a page or element on your site to your users.
How to do it:
- Select your goal.
- Create variations.
- Choose an appropriate start date.
- Run the experiment until you’ve reached a ninety-five percent confidence level.
- Review the data.
- Decide what to do next: stick with the control, switch to the variation, or run more tests.
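The "ninety-five percent confidence" step can be sketched with a two-proportion z-test using only the standard library. Real A/B testing tools run this check for you; the conversion counts below are invented for illustration:

```python
from math import sqrt, erf

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Compare two conversion rates; return (z score, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, Phi(x) = 0.5*(1 + erf(x/sqrt(2))).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: control converts 200/5000, variation 260/5000.
z, p = z_test(200, 5000, 260, 5000)
print(f"z={z:.2f}, p={p:.4f}, significant at 95%: {p < 0.05}")
```

If p is below 0.05 you have reached the 95% confidence level; otherwise, keep the experiment running.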
Form questions. Gather data. Analyze. One sequence, many approaches. Get started (right now!) and develop a research habit wherever and however you work.
Thank you for reading! If you enjoyed this summary or have any feedback, I’d love to hear from you. You can respond below this post or email me at firstname.lastname@example.org