Harpoon & Periscope UX Case Study

Navi
The Cryptonomic Aperiodical

--

Tezos user tools have come a long way this past year. Despite the chaos of the global pandemic in 2020, our space continued to mature with the addition of new tools and the refinement of existing ones. At Cryptonomic, we recently released two new data-analytics user tools: Harpoon and Periscope. As the resident UX designer 🙋, I think this is a great time to start an open discussion about how we can improve the user experience of tools in our ecosystem through focused user testing.

I designed and conducted a user research study to understand the motivations and values of key Tezos user segments, as well as to test the usability of the first iterations of Harpoon and Periscope. The research was not exhaustive, but it was incredibly rich with insights. The purpose of this case study is to give a high-level overview of what we learned about Harpoon and Periscope from the usability-testing portion of the study. At Cryptonomic, we are all passionate proponents of open-source knowledge sharing, so I hope to follow this article with a deeper meta-analysis of the methods used, the insights uncovered, and suggestions for how a UX approach can help those building and designing tools on Tezos.

Objective

We set out to map the motivations and values of key Tezos user segments and to test the usability of our user tools, Harpoon and Periscope.

Target Users

  • Involved users who follow Tezos closely
  • New entrants to the Tezos space
  • Tezos Bakers
  • Blockchain Developers

Methods

  • A screener survey to recruit a balanced representation of Tezos user segments.
  • User interviews to understand users’ values, motivations, and current workflows.
  • Usability testing using task scenarios and the think-aloud protocol.
  • Affinity mapping of insights from the interviews and usability tests.

Harpoon Usability Testing Key Insights

Average Usability Rating: 7/10

  • Inspired exploration: Users were often interested in or surprised by something new they learned on Harpoon, which led them to explore related questions. This validates that the data we surface has value.
  • Effectiveness of visual signifiers: Varying color opacity as a heatmap worked well in places like the performance table, letting users scan the data easily. The colors and symbols used in the rewards table to indicate positive, negative, or neutral numbers drew mixed interpretations, since there was no key explaining them.
  • Tezos terminology: Users were unfamiliar with terms like “blocks stolen” (a block baked by a lower-priority baker after the scheduled baker missed its slot), which led them to skip over or misinterpret the data.
  • “Whose data am I looking at and why?”: Users had difficulty distinguishing baker addresses from delegator addresses, since not all bakers have registered names. It was also unclear to users that the baker shown on the page by default was the baker of the latest block.
  • Baker grade calculation: 5 out of 5 users felt that the grade given was appropriate, or that bakers should be graded even more harshly based on the performance table data. One user noted that a stricter grading curve feels more reliable. There was, however, confusion about how to compare large bakers with lots of data against smaller ones.
  • Baker grade calculation formula: 4 out of 5 users appreciated the transparency of linking to the grade calculation formula, but felt it wasn’t intuitive to understand on Desmos.
  • Interpreting performance data requires context: Users had difficulty interpreting the performance table without comparative metrics from other bakers available alongside the baker they were asked to evaluate. The page hierarchy also puts the performance and rewards tables at the bottom, forcing users to scroll and hiding the context of whose data they are viewing.
  • Choosing a baker is more than a numbers game: 5 out of 5 users said they value bakers for their reliability and values. When selecting a baker, delegators look for things like the baker’s website, communication channels, other people’s experiences, and whether the baker is an active participant in Tezos governance.
  • Rewards Calculator: The rewards calculator often went overlooked since it sits at the bottom of the page. Most users had difficulty understanding how the calculator worked because they were confused by the editable fee and payout delay fields (a rough sketch of the kind of estimate such a calculator performs follows this list).
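
For readers unfamiliar with delegation rewards, here is a minimal sketch of the kind of estimate a rewards calculator might perform. It is not Harpoon’s actual formula; the names (estimateRewards, annualYield) and all numbers are illustrative assumptions, and it simplifies by assuming a flat network staking yield and an approximate cycle length.

```typescript
// A minimal, hypothetical sketch of a delegation rewards estimate.
// All names and numbers here are illustrative assumptions, not
// Harpoon's actual formula.

interface RewardEstimateInput {
  delegatedAmount: number;   // tez delegated to the baker
  annualYield: number;       // assumed gross network staking yield (e.g. ~5.5%)
  bakerFee: number;          // baker's fee as a fraction (e.g. 0.10 for 10%)
  payoutDelayCycles: number; // cycles before the first payout arrives
  daysPerCycle: number;      // roughly 2.8 days per cycle on Tezos mainnet
}

function estimateRewards(input: RewardEstimateInput) {
  // Gross yearly reward before the baker takes their fee.
  const gross = input.delegatedAmount * input.annualYield;
  // Net yearly reward after the baker's fee is deducted.
  const net = gross * (1 - input.bakerFee);
  // Rough wait, in days, before the first payout lands.
  const firstPayoutDays = input.payoutDelayCycles * input.daysPerCycle;
  return { gross, net, firstPayoutDays };
}

// Example: 1,000 tez delegated at a 10% fee with a 6-cycle payout delay.
console.log(estimateRewards({
  delegatedAmount: 1000,
  annualYield: 0.055,
  bakerFee: 0.10,
  payoutDelayCycles: 6,
  daysPerCycle: 2.8,
}));
// -> { gross: 55, net: 49.5, firstPayoutDays: 16.8 }
```

Even in this simplified form, the fee and payout delay are the two inputs doing the least obvious work, which matches where users got stuck in testing.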

Periscope Usability Testing Key Insights

Average Usability Rating: 8.8/10

  • Inspired exploration: Similar to Harpoon, users were often surprised by the data they found. Most didn’t realize how drastic the gap between large and small bakers was.
  • Interaction with data: Users expected to be able to click into the data and follow up on the addresses they saw.
  • Data is meaningful when it tells a story: Users expressed interest in the graphs that showed a trend or something out of the ordinary.
  • An ecosystem of data exploration tools: Users found the graphs on Periscope, the data on Harpoon, and the queries in Arronax extremely valuable when paired. The tools are currently linked, but there’s friction when a user tries to follow a question across them.

Next Steps

Overall, I think it’s important that we look at how to give users full context about the value these tools provide, and that we remember these tools are often windows through which new users learn about Tezos itself. Our next steps are to define a problem statement grounded in the research data, and then to brainstorm solutions through collaboration and design ideation. Thank you to everyone who participated in this first round of tests! We’re excited to keep reaching out to the community and including our users in our development process.
