Surveying Vulnerable Populations: A Case Study of Civil Society Organizations

Center for Long-Term Cybersecurity
Published in CLTC Bulletin · Jul 7, 2020

This paper, presented at the CHI 2020 Networked Privacy Workshop, was authored by Nikita Samarin, Alisa Frik, Sean Brooks, Coye Cheshire, and Serge Egelman, a team of researchers affiliated with the UC Berkeley Center for Long-Term Cybersecurity (CLTC) and the International Computer Science Institute (ICSI).

In the paper, the researchers discuss their ongoing study of how employees in civil society organizations (CSOs) perceive the risk of diverse cybersecurity and privacy threats. The researchers describe their survey-based methodology, as well as challenges they encountered, with the goal of informing other practitioners and researchers working on improving the security and privacy of CSOs.

Abstract

Compared to organizations in other sectors, civil society organizations (CSOs) are particularly vulnerable to security and privacy threats, as they lack adequate resources and expertise to defend themselves. At the same time, their security needs and practices have not gained much attention among researchers, and existing solutions designed for average users do not consider the contexts in which CSO employees operate.

As part of our preliminary work, we conducted an anonymous online survey with 102 CSO employees to collect information about their perceived risks of different security and privacy threats, and their self-reported mitigation strategies. The design of our preliminary survey accounted for the unique requirements of our target population by establishing trust with respondents, using anonymity-preserving incentive strategies, and distributing the survey with the help of a trusted intermediary. However, by carefully examining our methods and the feedback received from respondents, we uncovered several issues with our methodology, including the length of the survey, the framing of the questions, and the design of the recruitment email.

We hope that the discussion presented in this paper will inform and assist researchers and practitioners working on understanding and improving the security and privacy of CSOs.

Introduction

Researchers and practitioners have traditionally focused on the security needs and practices of average users, sidestepping underrepresented and vulnerable communities. In recent years, there has been an increase in research examining the security and privacy behaviors of such groups, revealing nuanced and community-specific concerns and practices that differ from those of average users [1, 2, 3, 4, 5, 6]. One such vulnerable online population consists of employees working for civil society organizations (CSOs), which include a wide range of groups, such as humanitarian organizations, labor unions, advocacy groups, indigenous peoples’ movements, faith-based organizations, community groups, professional associations, foundations, think tanks, charitable organizations, and other non-governmental and not-for-profit organizations [7]. Compared to other sectors, civil society organizations operate in elevated-risk contexts, as they are often targeted for political or ideological reasons by state-sponsored actors [8], political opponents [9], hate groups [10], and radicalized individuals [11]. Whereas attacks against average users and for-profit organizations mostly result in financial losses [12], attacks against individuals working for politically vulnerable CSOs often carry greater ramifications, including, in severe cases, threats to freedom of expression, liberty, and even life [13, 14, 15, 16].

Prior research indicates that civil society groups lack the funds and human resources to defend themselves against security and privacy threats [17]. For instance, they maintain a low ratio of IT staff to non-technical staff [18], do not conduct vulnerability assessments [19], and do not adopt solutions aimed at improving their cybersecurity [20]. Findings from a 2018 report by the Public Interest Registry [21] indicate that CSOs rarely have access to purpose-built systems, and instead tend to use commodity tools that are not tailored to their needs and elevated risk profiles. For instance, 58% of surveyed CSOs use Facebook messenger, which is not encrypted by default, to communicate sensitive information [21]. Although recent attempts have been made to design security solutions that are tailored specifically to CSOs [22, 23], they often fail to capture the needs, practices, and mental models of their intended users [24, 25].

In November and December of 2019, we conducted an anonymous online survey with 102 CSO employees to collect information about their risk perceptions and self-reported mitigation strategies. We surveyed employees at a broad range of organizations based in the U.S., some of which could be classified as high-risk and others as low-risk, in order to compare their average perceived risks. The survey also measured several other factors that we believe affect the attitudes and intentions of individuals to engage in protective behavior (e.g., risk awareness, self-efficacy, and perceived support). We also collected organizational and demographic information. Finally, we asked participants to provide feedback at the end of the survey to help us improve future iterations of the study.

In the rest of this article, we discuss the motivation for our work, the methodology for the preliminary study, the challenges we faced, and the lessons we learned from our experience and the respondents’ feedback. We hope that this discussion will also benefit other researchers and practitioners working in this area.

Motivation

One could try to dismiss cyberattacks against civil society and elevated-risk users as “edge cases” that deserve less attention than more sophisticated technical attacks, or threats that affect broader user populations. However, there are several reasons why understanding the context in which CSOs operate is essential. First, employees working for CSOs constitute a sizable proportion of the population: in the U.S. alone, they account for 11.4 million jobs, or 10.3% of the private-sector workforce [26]. Second, most CSOs employ standard tools used by millions of users [21], while their online risks are amplified compared to the general population [17], so this particularly vulnerable population could be considered “extreme users” [27]. Therefore, understanding how CSO employees use mainstream tools could reveal usability and security issues that might be overlooked in studies with typical user communities. Such insights will help to improve the design of technology for average use cases as well; for instance, enabling key security choices by default in popular platforms would improve security outcomes for both high-risk and average users. Finally, cyberattacks that target CSOs today are precursors of threats that could affect broader user groups in the future [28]. Understanding how to protect high-risk users against such threats would therefore also confer security on average users.

Methodology

The goal of our study is to better understand cybersecurity concerns and practices in CSOs in order to improve their resilience against cyberattacks. Based on our personal experience working with CSOs and prior work with journalists [2], activists [25], and humanitarian workers [29], we identified the following seven threats: phishing, malware, online harassment, online reputation attacks, physical device compromise, surveillance, and attacks on online services. We designed and executed a survey among employees at CSOs to assess respondents’ risk perceptions of each of these seven threats, and to collect information on their self-reported risk mitigation strategies for one specific threat chosen at random. Additionally, the survey presented a list of strategies that correspond to best practices for mitigating each threat, and asked participants to report their level of familiarity with these strategies and whether they had used them.
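The paper does not detail how this per-respondent randomization was implemented (survey platforms typically provide a built-in randomizer); the following minimal Python sketch, with a hypothetical assign_threat helper, illustrates the uniform random assignment of one threat per respondent described above.

```python
import random

# The seven threat categories named in the study.
THREATS = [
    "phishing",
    "malware",
    "online harassment",
    "online reputation attacks",
    "physical device compromise",
    "surveillance",
    "attacks on online services",
]

def assign_threat(rng: random.Random) -> str:
    """Pick one threat uniformly at random; the chosen threat determines
    which mitigation-strategy questions a respondent sees."""
    return rng.choice(THREATS)

rng = random.Random()  # one shared generator across respondents
print(assign_threat(rng))
```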

In addition to following standard questionnaire design guidelines, such as designing scales to measure the constructs, minimizing survey response time, and protecting the confidentiality of responses, we addressed several considerations that were specific to employees at CSOs: establishing trust, preserving anonymity, and addressing recruitment challenges.

Establishing trust. While communicating research risks is essential in any study that involves human subjects, it is especially important when surveying vulnerable populations. More specifically, revealing information about the current security practices and priorities of a CSO could place its employees, and the organization as a whole, at heightened risk. While our survey was completely anonymous and did not collect identifiers of any kind, we also had to ensure that our respondents felt safe enough to provide information related to our research goals. In addition to communicating our commitment to anonymization in the consent form, we highlighted the anonymity of the survey in a separate location within the survey itself, in order to establish trust with respondents and relieve their concerns. This also increased the chance that respondents were aware of the safeguards employed, even if some of them did not read the consent form in its entirety.

Using anonymity-preserving incentive strategies. We wanted to provide incentives to respondents as compensation for their time, and to increase participation rates in the survey. Because we did not assign any identifiers to survey participants, we could not follow up with them to provide direct compensation. Instead, at the end of the survey, respondents could select one of three charities, to which we will proportionally donate our compensation budget at the end of the study. Research has shown that material incentives (either monetary or non-monetary) [30] and sharing result summaries [31] increase response and completion rates for online surveys, without affecting the quality of responses [32, 33].
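As a rough illustration of this proportional-donation scheme, consider the following sketch; the charity names, vote tallies, and budget figure are hypothetical, as the paper does not report them.

```python
from collections import Counter

def allocate_donations(votes, budget_usd):
    """Split a fixed donation budget across charities in proportion
    to how many respondents selected each one."""
    counts = Counter(votes)
    total = sum(counts.values())
    return {charity: budget_usd * n / total for charity, n in counts.items()}

# Hypothetical tally: 102 respondents choosing among three charities.
votes = ["Charity A"] * 50 + ["Charity B"] * 30 + ["Charity C"] * 22
print(allocate_donations(votes, budget_usd=1000.0))
# {'Charity A': 490.2, 'Charity B': 294.1, 'Charity C': 215.7} (approximately)
```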

Using a trusted intermediary for participant recruitment. In order to reach our target population, we partnered with TechSoup, a nonprofit that coordinates an international network of other nonprofits, providing technical support, training, and tools. We distributed our survey in one of TechSoup’s periodic newsletters, which allowed us to leverage their large reach among nonprofits and their existing connections to our target audience. The survey was promoted via a banner ad (see Figure 1), which included an anonymous link to our survey. The content and format of the banner had to accommodate the existing conventions established in TechSoup’s newsletters. Recruitment text with a stronger call to action may have attracted more attention, but this compromise was worthwhile, as it allowed us to directly access our target population. Additionally, we are able to disseminate key findings back through TechSoup’s platform, so that participants who took part anonymously can review and learn from the results of the study.

Figure 1: Recruitment banner included in the email newsletter

Challenges

Previous surveys run by TechSoup through its network attracted between 2,500 and 20,000 responses, so we were optimistic that at least several thousand participants would respond to our survey. Unfortunately, only 160 individuals navigated to the survey at all; 16 potential participants were excluded because they never progressed past the consent form, 39 because they did not finish the survey, and 3 because of incorrect responses to attention-check questions, leaving 102 complete responses for our analysis. After analyzing the valid responses and the feedback from respondents, we identified several issues that we intend to address in the next iteration of the survey: the survey length, incorrect terminology, non-applicable questions, and the design of the banner ad.
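The exclusion funnel can be reproduced with simple arithmetic; the numbers below are taken directly from the text.

```python
visited = 160                 # individuals who navigated to the survey
stopped_at_consent = 16       # never progressed past the consent form
did_not_finish = 39           # abandoned the survey partway through
failed_attention_checks = 3   # answered attention-check questions incorrectly

complete = visited - stopped_at_consent - did_not_finish - failed_attention_checks
print(complete)                       # 102 usable responses
print(f"{complete / visited:.0%}")    # ~64% of visitors yielded usable data
```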

Length of the survey. By far the most common feedback we received was that the survey was too long and/or contained repetitive questions (mentioned by 41% of respondents who provided feedback). To assess key factors that affect the attitudes and intentions of participants to engage in protective behavior, we included a total of 11 scales, developed to measure constructs such as risk awareness, self-efficacy, response efficacy, perceived support, and perceived culture, with each scale comprising 2 to 7 items. This resulted in a median completion time of 19.8 minutes, which is well above the recommended 10 minutes for online surveys [34].

To address this challenge, we suggest narrowing the scope of research questions to reduce the number of constructs measured by the survey, or using a between-subjects approach, i.e., presenting only a subset of questions to each participant when the expected sample size is large (see the sketch below). We also suggest providing feedback to respondents as they complete the survey, for instance by displaying their current progress and explaining that, although questions might seem repetitious, they in fact measure different factors.
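A between-subjects shortening of the instrument might look like the following sketch; the scale names come from the text, but the subset size and the assignment helper are illustrative assumptions, not the authors’ design.

```python
import random

# Five of the construct scales named in the text (the full instrument
# measured 11 constructs in total).
SCALES = [
    "risk awareness",
    "self-efficacy",
    "response efficacy",
    "perceived support",
    "perceived culture",
]

def scales_for_respondent(rng: random.Random, k: int = 3) -> list[str]:
    """Show each respondent only k randomly chosen scales; the questionnaire
    gets shorter, but each construct needs a larger total sample."""
    return rng.sample(SCALES, k)

rng = random.Random()
print(scales_for_respondent(rng))
```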

Incorrect terminology. Another issue stemmed from our use of the word ‘employee’ throughout the survey to refer to the participant. At the beginning of the survey, we clarified that we make no distinction between different capacities of involvement with the organization and use the word ‘employee’ only for brevity. Nevertheless, 12% of respondents who provided feedback mentioned that they felt confused when responding to questions because their organization had no or few employees and was composed mostly of volunteers.

To avoid this issue, it is important to remember that individuals engage with CSOs in different capacities, including as employees, contractors, and volunteers. When referring to the respondent directly, one approach to solving this problem is to use neutral phrasing to encompass anyone working at a CSO (e.g. “as someone working for a civil society organization, consider the following…”). For questions that involve the organization itself, we recommend exhaustively listing different options to avoid any confusion (e.g. “How many individuals (including employees, volunteers, contractors, etc.) currently work for your organization?”).

Non-applicable questions. Our aim was to survey a broad range of CSOs, regardless of the cause they support, their position in the sector, or their size, which meant that some questions had to be broad enough to cover all types of organizations. Because of this breadth, however, 17% of respondents who provided feedback mentioned that they were unable to answer some of the questions because those questions did not apply to them. For instance, some respondents mentioned that their organization was too small to have information security policies, or that they do not report to the IT department because they are the IT department. For our single- and multiple-choice questions, we only included a ‘Don’t Know’ option and thus failed to account for questions that do not apply to certain respondents.

To address this problem, we recommend including a ‘Not Applicable’ option for each question, alongside an optional free-text box that can be used to provide additional comments and clarifications.

Recruitment banner design. We also identified an issue with the recruitment banner included in TechSoup’s email newsletter, shown in Figure 1. Some members of our research team who had access to the body of the email reported that it was not clear the banner had to be clicked to navigate to the survey; others reported that it resembled an advertisement banner that readers would ignore. We did not have much control over the design of the recruitment banner, as it followed TechSoup’s design guidelines, but we believe it might have contributed to non-response bias, both among those who did not realize the banner was clickable and among those who dismissed it as an advertisement.

To increase the number of prospective respondents, we recommend making it clear that the banner is clickable, either by including a textual explanation (e.g., “Click here to participate”) or by adding a graphic that resembles a clickable button. We also suggest including supporting text in the body of the email newsletter that draws attention to the survey, emphasizes that the survey is for academic (not commercial) purposes, sets expectations about the required time commitment, and highlights any compensation offered for participation.

Conclusion

We applied survey-based methods to understand cybersecurity concerns and practices of CSO employees, including their perceived risks of different security and privacy threats, and their self-reported mitigation strategies. The design of our preliminary survey accounted for the unique requirements of our target population by establishing trust with respondents, using anonymity-preserving incentive strategies, and distributing the survey with the help of TechSoup. However, by carefully examining our methods and the feedback received from respondents, we uncovered several issues with our methodology, including the length of the survey, our usage of terminology, non-applicable questions, and the design of the recruitment banner. We hope that the discussion of these challenges will not only assist us in the design of our future studies but will also benefit other researchers and practitioners working on understanding and improving the security and privacy of CSOs.

Acknowledgments

We would like to thank the Citizen Clinic at the Center for Long-Term Cybersecurity (CLTC) and members of the Berkeley Laboratory for Usable and Experimental Security (BLUES) lab for their support and for providing expert input and review of our survey instruments. We also thank TechSoup for their collaboration and for providing access to their network of nonprofits during this research project. This research is sponsored by funding from the CLTC at UC Berkeley.

References

[1] Boyd, D., 2014. It’s complicated: The social lives of networked teens. Yale University Press.

[2] McGregor, S.E., Roesner, F. and Caine, K., 2016. Individual versus organizational computer security and privacy concerns in journalism. Proceedings on Privacy Enhancing Technologies, 2016(4), pp.418–435.

[3] Yarosh, S., 2013, April. Shifting dynamics or breaking sacred traditions? The role of technology in twelve-step fellowships. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 3413–3422).

[4] Matthews, T., O’Leary, K., Turner, A., Sleeper, M., Woelfer, J.P., Shelton, M., Manthorne, C., Churchill, E.F. and Consolvo, S., 2017, May. Stories from survivors: Privacy & security practices when coping with intimate partner abuse. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 2189–2201).

[5] Blackwell, L., Hardy, J., Ammari, T., Veinot, T., Lampe, C. and Schoenebeck, S., 2016, May. LGBT parents and social media: Advocacy, privacy, and disclosure during shifting social movements. In Proceedings of the 2016 CHI conference on human factors in computing systems (pp. 610–622).

[6] Guberek, T., McDonald, A., Simioni, S., Mhaidli, A.H., Toyama, K. and Schaub, F., 2018, April. Keeping a low profile? Technology, risk and privacy among undocumented immigrants. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1–15).

[7] World Bank Group, 2020. Civil Society Policy Forum. Retrieved February 11, 2020 from https://www.worldbank.org/en/events/2020/04/17/civil-society-policy-forum

[8] Lipton, E., Sanger, D.E. and Shane, S., 2016. The perfect weapon: How Russian cyberpower invaded the US. The New York Times, 13. Retrieved February 11, 2020 from https://www.nytimes.com/2016/12/13/us/politics/russia-hack-election-dnc.html

[9] Scott-Railton, J., Marczak, B., Guarnieri, C. and Crete-Nishihata, M., 2017. Bitter Sweet: Supporters of Mexico’s Soda Tax Targeted With NSO Exploit Links. Citizen Lab.

[10] Brandom, R., 2016. Anonymous groups attacked Black Lives Matter website for six months. The Verge. Retrieved February 11, 2020 from https://www.theverge.com/2016/12/14/13951762/anonymous-black-lives-matter-ddos-attack-six-months-hacktivism

[11] Glaser, A., 2020. Bail organizations, thrust into the national spotlight, are targeted by online trolls. NBC News. Retrieved June 10, 2020 from https://www.nbcnews.com/tech/tech-news/bail-organizations-thrust-national-spotlight-are-targeted-online-trolls-n1226321

[12] Accenture and Ponemon Institute, 2019. Ninth Annual Cost of Cybercrime Study. Retrieved February 11, 2020 from https://www.accenture.com/us-en/insights/security/cost-cybercrime-study

[13] Marczak, W.R., Scott-Railton, J., Marquis-Boire, M. and Paxson, V., 2014. When governments hack opponents: A look at actors and technology. In 23rd USENIX Security Symposium (USENIX Security 14) (pp. 511–525).

[14] Marczak, B., Scott-Railton, J. and McKune, S., 2015. Hacking team reloaded? US-based Ethiopian journalists again targeted with spyware. Citizen Lab, 9.

[15] Crete-Nishihata, M., Dalek, J. and Deibert, R., 2014. Communities@ Risk: Targeted Digital Threats Against Civil Society. Citizen Lab, Munk Centre for International Studies, University of Toronto.

[16] Deibert, R.J., Rohozinski, R., Manchanda, A., Villeneuve, N. and Walton, G.M.F., 2009. Tracking GhostNet: Investigating a cyber espionage network.

[17] Brooks, S., 2018. Defending Politically Vulnerable Organizations Online. Center for Long-Term Cybersecurity, UC Berkeley.

[18] Hulshof-Schmidt, R., 2017. Nonprofit Technology Staffing and Investments Report. Retrieved February 11, 2020 from https://www.nten.org/article/your-guide-to-nonprofit-it-investment/

[19] CohnReznick, 2017. Not-for-Profit Governance and Financial Management Survey. Retrieved February 11, 2020 from https://www.cohnreznick.com/insights/2017-not-for-profit-governance-financial-management-survey

[20] Sierra, J.L., 2013. Digital and mobile security for Mexican journalists and bloggers. Freedom House.

[21] Public Interest Registry and Nonprofit Tech for Good, 2018. 2018 Global NGO Technology Report. Retrieved February 11, 2020 from https://www.givingtuesday.org/lab/2018/03/2018-global-ngo-technology-report

[22] Google, 2019. Advanced Protection Program. Retrieved February 11, 2020 from https://landing.google.com/advancedprotection

[23] Lerner, A., Zeng, E. and Roesner, F., 2017, April. Confidante: Usable encrypted email: A case study with lawyers and journalists. In 2017 IEEE European Symposium on Security and Privacy (EuroS&P) (pp. 385–400). IEEE.

[24] McGregor, S.E., Charters, P., Holliday, T. and Roesner, F., 2015. Investigating the computer security practices and needs of journalists. In 24th USENIX Security Symposium (USENIX Security 15) (pp. 399–414).

[25] Marczak, W.R. and Paxson, V., 2017. Social Engineering Attacks on Government Opponents: Target Perspectives. Proceedings on Privacy Enhancing Technologies, 2017(2), pp.172–185.

[26] U.S. Bureau of Labor Statistics, 2014. Nonprofits account for 11.4 million jobs, 10.3 percent of all private sector employment. Retrieved February 11, 2020 from https://www.bls.gov/opub/ted/2014/ted_20141021.htm

[27] Djajadiningrat, J.P., Gaver, W.W. and Fres, J.W., 2000, August. Interaction relabelling and extreme characters: methods for exploring aesthetic interactions. In Proceedings of the 3rd conference on Designing interactive systems: processes, practices, methods, and techniques (pp. 66–71).

[28] Scott-Railton, J., 2016. Security for the high-risk user: separate and unequal. IEEE Security & Privacy, 14(2), pp.79–87.

[29] Le Blond, S., Cuevas, A., Troncoso-Pastoriza, J.R., Jovanovic, P., Ford, B. and Hubaux, J.P., 2018, May. On enforcing the digital immunity of a large humanitarian organization. In 2018 IEEE Symposium on Security and Privacy (SP) (pp. 424–440). IEEE.

[30] Göritz, A.S., 2006. Incentives in web studies: Methodological issues and a review. International Journal of Internet Science, 1(1), pp.58–70.

[31] Dillman, D.A., 2011. Mail and Internet surveys: The tailored design method — 2007 Update with new Internet, visual, and mixed-mode guide. John Wiley & Sons.

[32] Göritz, A.S., 2004. The impact of material incentives on response quantity, response quality, sample composition, survey outcome and cost in online access panels. International Journal of Market Research, 46(3), pp.327–345.

[33] Sánchez-Fernández, J., Muñoz-Leiva, F., Montoro-Ríos, F.J. and Ibáñez-Zapata, J.Á., 2010. An analysis of the effect of pre-incentives and post-incentives based on draws on response to web surveys. Quality & Quantity, 44(2), pp.357–373.

[34] Revilla, M. and Ochoa, C., 2017. Ideal and maximum length for a web survey. International Journal of Market Research, 59(5), pp.557–565.
