UX Case Study: CRISPR Screening Workflow

Parul Bindal
Published in Elucidata
Feb 25, 2019

“The story starts from microbes and molecules. It starts with pure curiosity about an odd pattern that was seen in the DNA of some bacteria, and it has led to one of the most powerful breakthroughs in modern genetics.”
- Sean Eddy

1. Introduction to CRISPR Screening

In simple words, CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats, named for repeated bits of DNA) is a gene-editing technology. To understand CRISPR screening, think of DNA as a set of instructions; CRISPR/Cas9, a protein complex, is used to modify those instructions. CRISPR screening helps identify whether there is a significant relationship between the modified instructions and their effects. Before the CRISPR Screening Workflow, Polly, Elucidata’s software offering, already had an app called Polly MAGeCK VISPR, built on top of a CLI (Command Line Interface) based open-source project called MAGeCK VISPR (a hedged command-line sketch follows the figures below). Polly MAGeCK VISPR captures one part of the user’s process: quality control and visualization of CRISPR screens.

Fig 1. Quality Control UI in Polly MAGeCK VISPR App
Fig 2. Visualizations and tables of CRISPR Screens in Polly MAGeCK VISPR App
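
For context, the underlying MAGeCK VISPR project is driven from the command line. Below is a minimal sketch of invoking the MAGeCK scoring step from Python; the file names and sample labels are placeholders, and the exact flags should be verified against the MAGeCK documentation.

    # A rough sketch of driving the MAGeCK CLI from Python.
    # File names and sample labels below are placeholders, not real data.
    import subprocess

    # Score genes by comparing treatment vs. control sgRNA read counts.
    subprocess.run([
        "mageck", "test",
        "-k", "counts.txt",  # sgRNA read-count table (placeholder)
        "-t", "treatment",   # treatment sample label (placeholder)
        "-c", "control",     # control sample label (placeholder)
        "-n", "demo",        # prefix for the output files
    ], check=True)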

2. Discover and Define

2.1 Problem and Concepts

The first step was to understand the science and the steps users take before and after using MAGeCK VISPR, which could help us understand their goals and end-to-end flow. We started by reading research papers in this domain, followed by papers on MAGeCK VISPR specifically. Critically using and analyzing MAGeCK VISPR on Polly helped us figure out the challenges existing users might be facing.

Challenges in MAGeCK VISPR

  1. Does not cater to the end-to-end user flow
  2. Overall UI is not very intuitive
  3. One of the most requested features was the ability to view multiple comparisons (as many as nine!) at the same time
  4. Does not align with our UI guidelines

Once this research gave us the detailed workflow of a typical user, we validated it with our expert data scientists, a.k.a. our internal users.

2.2 Identifying the requirements

Though we were able to identify the major requirements by reading research papers, talking to our internal users, and analyzing the Polly MAGeCK VISPR app, we still needed answers to some questions that would help us make more granular decisions.

User Research

A part of user research was asking the right questions to understand the possible nuances clearly. Some of the questions are listed below:

  1. Do you use any tools to get the list of interesting genes?
    a) If yes, which ones?
    b) If not, do you write the code yourself in a Jupyter Notebook?
  2. How often do you follow this flow?
  3. How many days does the whole workflow take for one dataset?
  4. How many comparisons would you like to view at the same time?

Primary Stakeholders

  • Elucidata has a team of ‘Internal Users’ who provide our service to the external customer base, which is why they are also potential users of Polly. We actively test our ideas with this team to gauge the sustainability and viability of our concepts.
  • External Users are typically bioinformaticians, scientists, and data analysts working in the field of bioinformatics.

Secondary Stakeholders
Developers and Product Team

2.3 User flow

High-level User Flow

This high-level user flow helped us align our efforts with the motivations and expectations of users.

Fig 3. High-level User Flow

The output here is the list of genes the user is interested in, narrowed down from a large pool of genes using this workflow (a hedged filtering sketch follows).
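
To make that narrowing concrete, here is a minimal sketch of filtering a gene-level summary down to a short hit list. The column names (“id”, “neg|fdr”) follow MAGeCK’s gene_summary output format, and the cutoff is an arbitrary choice for illustration; verify both against your own files.

    # Illustrative narrowing of a large gene pool to a short hit list.
    # Column names follow MAGeCK's gene_summary format; verify them
    # against your own output files before relying on this.
    import csv

    def interesting_genes(path, fdr_cutoff=0.05):
        """Return gene IDs whose negative-selection FDR beats the cutoff."""
        with open(path) as fh:
            rows = csv.DictReader(fh, delimiter="\t")
            return [row["id"] for row in rows
                    if float(row["neg|fdr"]) < fdr_cutoff]

    # hits = interesting_genes("demo.gene_summary.txt")  # placeholder path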

Detailed User Flow

Fig 4. Detailed User Flow

3. Development and Delivery

3.1 Sketching

Once we had a better understanding of user goals and expectations, we chose to make sketches so we could iterate on ideas faster.

Fig 5. Sketches

3.2 Wireframes

High-fidelity Wireframes

We focused on detailed functionality rather than on what the screens look like.

Fig 6. High-fidelity Wireframes

3.3 High-fidelity Designs

Once the wireframes had evolved and been approved by the different stakeholders, we created high-fidelity design mockups. Below, we highlight the screens we designed as solutions to our problems instead of showcasing every screen.

Representation of large data in a meaningful manner

Fig 7. How are graphs shown when multiple comparisons are selected

In simple words, the end goal of Quality Check (QC) is to verify whether certain values (for example, GC content and base quality) are as expected. We show this information using red crosses and green ticks for multiple comparisons, where red means a value is below a particular threshold and green means it is above. These statuses (red, green, and yellow) are also reflected in the graph on the right so that users can interpret their data at a glance.
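
As a rough sketch of the pass/fail logic behind those ticks and crosses, assume each metric has a single threshold; the metric names and threshold values below are illustrative, not Polly’s actual configuration.

    # Illustrative QC status logic; metric names and thresholds are made up.
    QC_THRESHOLDS = {"gc_content": 40.0, "base_quality": 30.0}

    def qc_status(metrics):
        """Map each metric to 'green' (at/above threshold) or 'red' (below)."""
        return {name: "green" if value >= QC_THRESHOLDS[name] else "red"
                for name, value in metrics.items() if name in QC_THRESHOLDS}

    # One status dict per comparison drives both the tick/cross list
    # and the colour coding of the graph on the right.
    per_comparison = [
        qc_status({"gc_content": 47.2, "base_quality": 28.1}),
        qc_status({"gc_content": 35.0, "base_quality": 33.4}),
    ]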

Information Heavy Screens

Fig 8. How tables look when multiple comparisons are selected

On screens like the one shown above, there can be many tables, where one table (shown in one color) corresponds to one comparison. The options to show or hide tables and to change their order let users choose which tables they see and in what order (a small state sketch follows).
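
A minimal sketch of the state those Show/Hide and reorder controls manipulate; the comparison names here are hypothetical.

    # Illustrative state model for the Show/Hide and reorder controls.
    # Comparison names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class TableState:
        comparison: str
        visible: bool = True

    tables = [TableState("day14_vs_day0"), TableState("drug_vs_vehicle")]

    def toggle(tables, comparison):
        """Flip the visibility of the table for one comparison."""
        for t in tables:
            if t.comparison == comparison:
                t.visible = not t.visible

    def move(tables, comparison, new_index):
        """Reorder a comparison's table to a new position."""
        t = next(x for x in tables if x.comparison == comparison)
        tables.remove(t)
        tables.insert(new_index, t)

    toggle(tables, "drug_vs_vehicle")            # hide one table
    rendered = [t for t in tables if t.visible]  # what is drawn, in order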

Non-standard workflow of different users

Fig 9. Dashboard
  • Using Modules
    There is no linear pattern to the workflow: all the modules are independent entities, so if a user happens to use one method and not another, he or she can skip it completely. The modules are resizable, movable, and can be minimized.
  • Using Pipelines
    There are multiple open-source pipelines available in this domain, and MAGeCK VISPR is one of them. Different academic labs and companies use different pipelines, so to cater to those needs we let users select one of the pipelines at the beginning. Depending on which pipeline they choose, the design of that particular module changes (see the sketch after this list).
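
A minimal sketch of how a pipeline choice could drive which modules the dashboard shows; the pipeline and module names here are hypothetical, not Polly’s actual configuration.

    # Hypothetical registry mapping a pipeline choice to dashboard modules.
    PIPELINE_MODULES = {
        "mageck-vispr": ["quality_check", "analysis", "visualization"],
        "other-pipeline": ["quality_check", "custom_scoring"],
    }

    def dashboard_modules(pipeline):
        """Return the independent, skippable modules for a chosen pipeline."""
        return PIPELINE_MODULES.get(pipeline, [])

    print(dashboard_modules("mageck-vispr"))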

Multiple Comparisons

The number of graphs and tables on the screen grows in proportion to the number of comparisons. To cater to users who might end up looking at a large number of comparisons, we used the following methods:

  • Carousel
    Instead of showing all the graphs on the screen at once, we used carousels (a paging sketch follows this list).
  • Show/Hide
    As mentioned above, Show/Hide lets users choose which elements they see, acknowledging the limited space of a one-page, no-scroll screen.
  • Grouping elements of the same category
    For example, we grouped all the comparison-related actions, such as adding, deleting, viewing logs, selecting the base comparison, and selecting multiple comparisons, under one umbrella called Manage Comparisons.
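
A rough illustration of the carousel behaviour: the graphs for the selected comparisons are chunked into fixed-size pages. The page size of three is an arbitrary choice for this sketch.

    # Illustrative carousel paging: chunk selected comparisons into pages.
    def paginate(items, page_size=3):
        """Split a list into consecutive pages of at most page_size items."""
        return [items[i:i + page_size]
                for i in range(0, len(items), page_size)]

    comparisons = [f"comparison_{i}" for i in range(1, 10)]  # up to 9 at once
    pages = paginate(comparisons)  # three carousel pages of three graphs each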

3.4 Prototyping

Our UI design tool of choice is Sketch. To invite valuable feedback from our users, we exported these high-fidelity user interfaces to InVision for prototyping.

3.5 Validation and Testing

After feedback from the product team and internal users, clickable prototypes were shared with both external and internal users.

The demographics of the internal and external users (combined) who participated in the user interviews were as follows:

  • Gender — 3 Females + 4 Males
  • Location — India (Face to Face), US (Remotely)
  • Age — 25 years to 35 years

The sessions were a mix of the Think Aloud and prototype walkthrough methods. We had a script ready with a set of questions to ask at the end of each interview, which would help determine the features to take up in the Minimum Viable Product (MVP). Towards the end, a document was shared with the external users along with the prototype, asking questions to understand what they thought was important and what they thought was not.

3.6 Shipping to developers

Only a part of your work is done when you export your designs; working with developers to verify the quality of the implementation is the most important step. We used InVision to hand the designs over to developers, where they could inspect them. The developers dissected the designs and asked questions about granular details that a designer could have overlooked among a large number of screens, or could have made preemptive assumptions about. After the design review by the developers and a few small iterations, the designs were ready to be implemented! This is the most exciting step for new designers like me, for one reason: you can see the impact of what you designed in real life!

4. Takeaways

  • Spend more time on user research: it helps more in figuring out the granularities of the user flow than the actual design work does. We ended up iterating on the UI multiple times, which could have been avoided had the screens been frozen at the high-fidelity wireframe level after feedback from the product team, developers, and internal users. Iterations after the external design review, however, cannot be avoided and are one of the most important steps to get it “just right”.
  • Keep developers in the loop from the beginning: it helps in keeping a check on the feasibility of what you are designing.
  • Understand the science and context: step back from the role of designer and understand the science behind whatever is to be designed. It will help in identifying the edge cases that might pop up later and lead to significant changes in the design.

Check out the finished product yourself! Have a quick demo here: Polly

Resources: https://news.harvard.edu/gazette/story/2018/05/crispr-pioneer-jennifer-doudna-explains-gene-editing-technology-in-prather-lectures/

Also published on blog.elucidata.io
