How usability testing my UX portfolio made me a better researcher

Drew Long
Published in Bootcamp
6 min read · Jul 2, 2021

“He’s put design research as a skillset, but I don’t think he knows much about design research.”

This comment came in a recent usability testing session of my UX portfolio with a top-tier UX professional and mentor from ADPList. This was tough feedback, but it was exactly what I needed to hear. As a project, usability testing improved my portfolio and expanded my professional network. But more importantly, it afforded me valuable experience ‘thinking like a UX researcher’.

Background

It all started with a well-defined problem to be solved: my lack of success on the job market as an emerging UX researcher and designer, and no insight as to why. I had recently completed a UX bootcamp, and I designed my portfolio to showcase my work and land a job. However, the tremendous effort of creating the portfolio made me hesitant about getting feedback that might change it. Then I realized that this was a great opportunity to flex my muscles and gain some practical experience.

I designed the research around the excellent guidance in David Travis and Philip Hodgson’s essential book, “Think Like a UX Researcher”.

Research question

What does a UX hiring manager or recruiting professional learn about me as a candidate from browsing my portfolio?

I chose this question because it: a) asks something important, b) is focused and specific, and c) is testable.

Usability Testing

With two types of target user defined, I developed a usability test plan around two scenarios.

Scenario 1: Corporate Recruiter; Scenario 2: Senior UX Designer

Sessions were structured to include: introductory questions about the participant’s background, the scenario and tasks, and follow-up questions.

Listing of Tasks and Follow-up Questions

On to recruitment, and a tremendous shout-out to my participants, most of whom I recruited from ADPList and the Research by Learners Slack community! ADPList is a platform where working UX professionals volunteer their time as mentors to emerging design professionals, and Research by Learners is an encouraging and helpful forum for researchers at all levels of experience. Please check out both of these resources, especially if you’re a junior UX professional.

I am so grateful to the senior UX and recruiting professionals who shared their time and honest feedback for this project.

Participants: Eve Cuthbertson, Radhika Ramadoss, Beant Kaur Dhillon, Tara Butler, Andrea Towe, Pratik Joglekar

I completed usability testing over about a week and began immediately to analyze the results.

Start with the Data

It was crucial to begin the analysis by simply organizing the data, without interpreting it or drawing conclusions. The raw data consisted of my observations (e.g., user quotes, actions, pain points, goals) and took the form of many Post-its on a virtual whiteboard in InVision.

I organized the observations into logical groups, according to the context or topic. There were four primary groupings (with many sub-groups) as shown in the image: website, case study feedback, overall impressions, writing style.

Visual organization of the data by topic: website, case study feedback, overall impressions, writing style

The overall impressions were particularly important, as they speak most directly to my research question. Some of them were tough. I include them here to show that I took them to heart. The following came in response to the question, “What do you remember about this candidate?”

“That’s a good question.” (Participant 1)
“It’s a good question. I’m stuck on answering it.” (Participant 2)
“Is he a designer or researcher?” (Participant 3)

Participant comments: “Overall impressions” — This was great feedback, not only for the portfolio, but for me personally.

Generate Insights: What is the Underlying Problem?

From here, it was time to interpret the data and generate insights. Insights are conclusions you’ve drawn from the data, stated succinctly and provocatively, like a newspaper headline. I clustered the data into smaller groups focused on the most important issues, as well as the “lowest-hanging fruit” in terms of easy fixes.

My analysis revealed the following key insights:

  1. Portfolio lacks a memorable value proposition
  2. Imprecise wording and titling set false expectations about the candidate’s qualifications
  3. Case studies are cluttered and have poor readability
  4. Users struggle to find contact information and the resume

Observations are clustered based on the insight that can be drawn, stated concisely in a single sentence.

Testing also revealed some interesting smaller issues — for example, the “email” link. I had a link in my footer — “email” — that opens a message to my professional email address in your default email application. That’s all it says: “email”. Four out of six participants indicated they don’t use an email client that opens such links. And in fact, I don’t either! Like them, I usually copy and paste the address into an open email. So, there’s a (small) mistake I never should have made, but one I never would have caught without the testing.
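A minimal sketch of one possible fix, assuming a plain HTML footer (the function name, CSS class, and address here are illustrative, not from my actual site): render the address itself as the visible link text, so visitors without a configured mail client can still copy and paste it, while keeping `mailto:` as a convenience for those who do.

```javascript
// Hypothetical helper: build footer markup that shows the address itself,
// rather than the bare word "email". The mailto: link still works for
// visitors with a default mail client; everyone else can copy the text.
function emailFooterHtml(address) {
  // Address is assumed to be plain ASCII; escaping is omitted for brevity.
  return `<a class="footer-email" href="mailto:${address}">${address}</a>`;
}

// emailFooterHtml("hello@example.com")
// → '<a class="footer-email" href="mailto:hello@example.com">hello@example.com</a>'
```

The key design choice is simply making the address visible: the link text carries the information, and the `mailto:` behavior becomes a bonus rather than a requirement.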

Develop Hypotheses and Propose Solutions

I considered these four major problems and made my best guess, or hypothesis, about the explanation for each.

For example, one major issue was that some participants misinterpreted my qualifications and level of expertise. This set false expectations for my work on the portfolio and, rightly, made a bad impression. My hypothesis was that the wording in multiple places on the website (e.g., the hero section, project titles, sub-headings) was inaccurate or inadequately descriptive.

Each hypothesis then suggests one or more solutions, which should themselves be testable.

In the previous example, I proposed the following changes:

  1. Update the wording and labeling of the hero section and projects (reflecting these changes in the case studies as well).
  2. Re-assess skills on About page for possible revision.

In proposing solutions, the goal was to do the least possible. These smaller changes also reflect the iterative approach used by agile teams in the real world. The complete list of insights, hypotheses, and proposed changes is shown in the table below.

Table capturing insights, hypotheses, and simplest changes for implementation

Create Solutions

I returned to my portfolio to implement the solutions clearly stated in the analysis. Some of the changes I have already made include:

  • Adding top-level navigation items for “Contact” and “Resume”, increasing the visibility and accessibility of these high-priority areas
  • Adding “emerging” to my hero section content, to disambiguate my level of experience
  • Revising the tags I associated with each case study

Conclusion and Next Steps

This was a great project. It led to networking, improvements to my portfolio, and a lot of learning and practical experience in user research. The next steps include further upgrades. However, one major consideration is whether to keep improving this website, which has significant technical limitations (it was made in Notion), or to build a new, more powerful website from scratch.

Here’s my advice (to myself, and maybe useful to you): rather than obsess irrationally over a portfolio website that never feels ‘finished’, make an informal cost-benefit analysis: how much benefit will I get from a new site, versus how much will it cost in time and lost productivity on new projects? You know what, I think I’ll take the iterative approach for now.

Thanks for reading, and thank you to the awesome professionals who shared their time, mentorship and invaluable feedback.

Eve Cuthbertson (UX Researcher, Questrade), Radhika Ramadoss (Associate Director, UX, PayU), Beant Kaur Dhillon (Senior User Researcher and Usability Consultant), Tara Butler (Senior UX Strategist, Purple Rock Scissors), Andrea Towe (Career Coach, CareerFoundry), and Pratik Joglekar (Senior Product Designer, Mass Mutual)


Freshly minted User Researcher and Designer finding my footing after a career in K-12 education