Designing for an at-risk world

Shayma works as an activist with Libyan Women First, an organization that seeks to increase women’s political participation and economic empowerment in Libya and advocates against gender-based violence. Like many of us, she relies on technology to do her job. Unlike most of us, Shayma faces daily threats of violence in the real world and is targeted by advanced malware and phishing attacks, as well as malicious online intimidation campaigns specifically designed to silence her.

As designers working in tech, we often design for people like Shayma in at-risk situations, even if we don’t know it. Consider, for example, account hacking. In many cases, a hacked account represents an inconvenience or, depending on the data involved, a risk of identity theft. Due to the political situation in her country and the nature of her work, the stakes for Shayma are far higher. It’s therefore imperative to ensure that people in at-risk situations, like Shayma, can be helped by technology when they need it most.

But what does that term “at-risk” mean? At Jigsaw, we consider “at-risk” situations to be those in which bad actors or circumstances generate exceptional risk for people. Some people face at-risk situations because of who they are (a minority), what they do (an activist or journalist), where they live (a conflict zone or an abusive household), or what they share (someone who promotes a particular ideology on social media). Over the course of our digital lives, many of us, including many readers of this post, will face at-risk situations, if we aren’t in one already.

Designing for people in at-risk situations is hard. While we try our best to address the needs and preferences of the vast majority of users, none of us has the capacity to personally witness the full diversity of the world’s situations, so there will always be contexts that designers can’t foresee. For example, unless you’ve had experience on the ground, it can be challenging to channel the experience of someone using your product in a conflict zone. And the default setting that suits 99% of people may not be the right one for the remaining 1%, so it can be impossible to know when research and design are “done.” These intrinsic limitations make it harder to serve users in at-risk situations.

Designing for the high-risk user

While the typical company designs for the vast majority of users, at Jigsaw, we work to understand the needs of people in at-risk situations, and build tools to try to address some of the specific situations they face. For example, we have worked to provide access to the open internet, to protect against DDoS attacks, and to help detect phishing attacks. We meet with activists, journalists, and other frequently targeted groups, and we collaborate with partners at Google to develop frameworks for understanding what at-risk users need and how designers can help accommodate those needs.

When designing for at-risk situations, the first step is to map out the goals, stress levels, and expectations of users throughout their at-risk journey. Knowing the stages of the user journey can inform the design of a piece of technology.


We often call the first stage of the user journey Prevention, which refers to what people do in early moments to minimize their risk down the road. For example, we have met people from high-crime parts of Mexico who worry that sharing their location can increase their risk of being victimized. Technology can help in this context by making it easy to disable any tracking or sharing of their location.

As designers, we should treat anything that could impact privacy and security (settings, app permissions, terms of service) as an opportunity to help people navigate risky, real-world situations. Most people aren’t particularly stressed by the choices they make at this stage of the journey; they just want to understand the possible consequences of their decisions, so there’s often time to carefully weigh tradeoffs.
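As a thought experiment, the Prevention principle could be sketched in code: risky capabilities default to off, and enabling one surfaces its consequences first. Everything here (the `UserSettings` class and its fields) is hypothetical and illustrative, not any real product’s API.

```python
# Hypothetical sketch of privacy-protective defaults at the Prevention stage.
# All names are illustrative; no real product API is implied.
from dataclasses import dataclass


@dataclass
class UserSettings:
    # Risky capabilities default to off; the user must opt in explicitly.
    location_sharing: bool = False
    location_history: bool = False
    public_profile: bool = False

    def enable(self, setting: str, consequences: str) -> None:
        """Enable a setting only after surfacing its possible consequences."""
        print(f"Enabling {setting}: {consequences}")
        setattr(self, setting, True)


settings = UserSettings()
assert not settings.location_sharing  # safe by default
settings.enable("location_sharing", "your real-time location may be visible to others")
```

The design choice this models is the low-stress nature of the Prevention stage: the person has time to weigh tradeoffs, so the opt-in flow explains consequences rather than burying them.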


The second stage is Monitoring. At this stage the person’s stress level is probably low (they are just checking things), and they likely have ample time to follow links and read further. Technology can help by making it easy to verify that the settings are appropriate for the person’s context. In the example from Mexico, users may want to confirm that GPS (or any other location-based service) is turned off. This information should be easily accessible and clear. In some circumstances, monitoring may also be initiated automatically by the app or service. For example, apps can send the person a notification saying “New settings available.”
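The Monitoring stage could be sketched as a periodic audit that compares current settings against the person’s risk context and turns mismatches into clear alerts. This is a minimal, hypothetical sketch; the function name, setting keys, and risk-profile fields are all invented for illustration.

```python
# Hypothetical sketch of the Monitoring stage: audit settings against the
# person's context and flag mismatches. All names are illustrative.
def audit_settings(settings: dict, risk_profile: dict) -> list[str]:
    """Return human-readable alerts for settings that conflict with the
    person's risk profile (e.g., location services on in a high-risk area)."""
    alerts = []
    if risk_profile.get("location_sensitive") and settings.get("location_services"):
        alerts.append("Location services are on; consider turning them off.")
    if risk_profile.get("targeted_by_phishing") and not settings.get("two_factor_auth"):
        alerts.append("Two-factor authentication is off; consider enabling it.")
    return alerts


alerts = audit_settings(
    {"location_services": True, "two_factor_auth": False},
    {"location_sensitive": True, "targeted_by_phishing": True},
)
# Each alert could become a clear, easily accessible in-product notification.
```

Because stress is still low at this stage, the alerts can link to longer explanations rather than demanding immediate action.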


The third stage is Crisis, or the “Oh no!” moment. Shayma, the activist from Libya, was in this stage when she discovered that bad actors had hacked her email account. At that moment she was extremely stressed, and only wanted to understand what was happening and how to make herself safe. During the crisis stage, technology can help people quickly report what is happening and get support.

Since stress spikes during the crisis stage, it is often important to “cut through the clutter.” Technology can help by providing simple, clear next steps, such as information on emergency hotlines or other self-help resources when available.


The last stage is Recovery, which starts right after the crisis has been resolved. This is when a person starts fixing broken things, such as restoring an account or repairing an online identity. Shayma had to create new accounts, which was somewhat stressful, but this time she was armed with information to increase her security online. During this stage, technology can help by communicating with empathy and streamlining the process of fixing or rebuilding.

Ultimately, Jigsaw’s engineers and friends on the Google security team helped Shayma through her crisis and recovery moments. Engaging and educating at-risk communities is important for making users aware of their options for protecting themselves when using technology in high-risk situations. Yet any of us could face an at-risk situation, and not always with personal tech support on hand; design therefore has an important role to play as well. Our hope is to make it easier to consider the situations Shayma and others might confront before they face them: within our own technology, with our collaborators at Google, and across the design and development communities at large. Good design delights us. By planning for at-risk situations in our product design, we can also ensure that good design helps during the most stressful moments.

Izzie Zahorian is a Seattle-based designer and researcher who enjoys listening, traveling, and learning from others. She is currently hiking the Washington PCT to Canada.

Dalila Szostak is the lead experience researcher at Jigsaw and currently very jealous of her dear colleague and friend Izzie.