Chapter IV — Untangling responsibilities in the digital world

Zubair Ashraf · Published in Digital Narratives · Feb 2, 2024
Image by Suriya.

The question of where platform responsibility ends and where that of the user begins is a difficult one (Helberger, Pierson, and Poell, 2018, p. 2). Most of this discourse is grounded in the host-editor dichotomy (Horten, 2016; Angelopoulos and Smet, 2016; Hoboken, 2009; Helberger, 2011, cited in Ibid.). If platforms qualify as hosts, then, under the European e-commerce regime, they bear limited responsibility (Ibid.); if they qualify as editors, they bear full responsibility for what is shared through them (Ibid.). Platforms emphasise their role as facilitators or hosts (Helberger, Pierson, and Poell, 2018, p. 7). However, considering how content circulates on platforms, they appear not merely as hosts but as vital actors (Ibid.). They build techno-commercial infrastructures geared towards increasing user engagement and spreading content virally (Gerlitz and Helmond, 2013, cited in Ibid.).

However, this black-and-white allocation of responsibility does not reflect the actual role and capacities of platforms to prevent certain undesirable outcomes (Hoboken, 2009; Horten, 2016, cited in Ibid.). While platforms fundamentally shape users’ activities, they do not determine what is shared through them (Helberger, Pierson, and Poell, 2018, p. 2). WhatsApp, for example, allows users to share text, images, videos, GIFs, PDFs, stickers and more, but it does not control what content these media carry. Many of the problems on online platforms are, to some extent, user-driven (Ibid.).

If multiple actors, including users, platforms and the state, are responsible for shaping the digital space, then it appears to be a problem of many hands (Ibid., p. 3). This term refers to a situation in which different actors can contribute, in their own distinct ways, to a problem or to its solution, without it being clear who is responsible for which actions and which consequences (van de Poel et al., 2012; Doorn, 2012, cited in Ibid.). The problem of many hands has also been defined as a “morally problematic gap in the distribution of responsibility” (van de Poel et al., 2012, p. 62, cited in Helberger, Pierson, and Poell, 2018, p. 3). Legally speaking, the issue is how to fix responsibility on one or more actors in an enforceable way, as neither the platforms nor the users can be held fully accountable (Helberger, Pierson, and Poell, 2018, p. 3). Fahlquist (2009, pp. 115–116, cited in Ibid.) argues that capacity and power entail responsibility: if an agent, whether an individual or an institution, has the capacity, power and resources to solve a social problem, then it has a responsibility to do so. The question, then, is whether the users of platforms have these capacities.

Users alone, however, will not be able to make platforms safer spaces. Placing a blanket responsibility on users to be careful with what they share through digital platforms would not be appropriate, because some people, with the capacity, knowledge and inclination, would comply, while others who are less privileged or less concerned would not (Helberger, Pierson, and Poell, 2018, p. 3). It is also almost impossible to imagine that members of the public would voluntarily turn away from the sight of a woman, or any other person, exposed against her or their will (Franks, 2017, p. 4). In the case of nonconsensual intimate images and videos, abbreviated as NCIIs, the lack of consent is the central concern (Ibid.). Although acknowledging user responsibility is useful, it is crucial not to overlook the institutional responsibility of platforms.

Platforms have responsibilities of their own, such as complying with data protection laws and with requests to take down potentially harmful or offensive content (Helberger, Pierson, and Poell, 2018, p. 3). They also have an obligation to create the conditions that allow users to act responsibly, for instance by raising awareness and by informing and educating users (Ibid.). Not only platforms but also other organisations, such as the non-profit Digital Rights Foundation Pakistan, have been running campaigns to raise awareness about digital security, and it has been observed that people with greater awareness of laws and rights are more likely to report digital crimes (Basit, 2022, p. 11).

Helberger, Pierson, and Poell (2018, p. 4) argue that, in the case of platforms, architectural design choices play a role similar to that of the laws and procedures which allocate and distribute responsibility in institutions. They assert that reviewing design choices, such as flagging mechanisms, the configuration of recommendation systems, and incentives for users to engage with content, can encourage the public to report NCIIs (Ibid.). Platforms’ terms of use also play a role, as it is here that platforms allocate responsibility between themselves and their users, although there is no guarantee that this distribution is fair and balanced (Ibid.). But do the platforms take their self-proclaimed responsibility seriously?

WhatsApp’s responsibility:

WhatsApp (2021) prohibits the use of its service for illegal, harassing and obscene activities, warning that accounts violating its terms of use may be disabled or suspended. However, it appears that WhatsApp either has not been complying with its self-assigned responsibility or lacks the capacity to remove reported NCIIs from its platform. For example, during this research it was observed that nude and intimate videos of TikToker Hareem Shah, YouTuber Aliza Sehar and singer Rabi Pirzada were still circulating in WhatsApp groups and social media channels dedicated to leaked content, despite having been reported to the authorities (Arora, 2023; Asad, 2023; Afzal, 2019). Basit (2023) says it is almost impossible to get an NCII removed from WhatsApp. Franks (2017, p. 20) maintains that platforms have a huge capacity to shut down malicious activities; it is only a question of will.

WhatsApp has also introduced a flagging mechanism (Sen, 2021). A flag is defined as a sociotechnical instrument for reporting potentially harmful or offensive content to a platform (The Media Manipulation Casebook), and content can be flagged by an algorithm, a content moderator or a user (Ibid.). According to Crawford and Gillespie (2016, p. 411, cited in Helberger, Pierson, and Poell, 2018, p. 7), the flag is not simply a technical device but a marker of interaction between users, platforms, humans and algorithms, as well as broader political and regulatory forces. Therefore, instead of allocating responsibility to one central actor, the roles of a variety of actors should be identified (Helberger, Pierson, and Poell, 2018, p. 7).
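To make this distributed character of the flag more concrete, the minimal Python sketch below shows how reports from the three actor types named above, an algorithm, a content moderator and a user, might feed a single review queue. It is purely illustrative: the class names, actor labels and the rule that NCII reports are surfaced first are my own assumptions, not WhatsApp's actual implementation or any platform's documented system.

```python
# Illustrative sketch only: flags raised by different actors converge on one review queue.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# The three actor types named in the Media Manipulation Casebook definition.
ACTOR_TYPES = {"algorithm", "moderator", "user"}


@dataclass
class Flag:
    content_id: str   # identifier of the flagged item
    actor_type: str   # "algorithm", "moderator" or "user"
    reason: str       # e.g. "ncii", "harassment", "spam" (hypothetical labels)
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class ReviewQueue:
    """Collects flags from all actor types and surfaces NCII reports first."""

    def __init__(self) -> None:
        self._flags: List[Flag] = []

    def submit(self, flag: Flag) -> None:
        # Whoever raises the flag, it enters the same queue.
        if flag.actor_type not in ACTOR_TYPES:
            raise ValueError(f"unknown actor type: {flag.actor_type}")
        self._flags.append(flag)

    def next_for_review(self) -> List[Flag]:
        # NCII flags first (assumed prioritisation), then arrival order.
        return sorted(self._flags, key=lambda f: (f.reason != "ncii", f.created_at))


if __name__ == "__main__":
    queue = ReviewQueue()
    queue.submit(Flag("vid-001", "user", "ncii"))        # user report
    queue.submit(Flag("img-042", "algorithm", "spam"))   # automated detection
    queue.submit(Flag("vid-001", "moderator", "ncii"))   # moderator escalation
    for flag in queue.next_for_review():
        print(flag.content_id, flag.actor_type, flag.reason)
```

The point of the sketch is architectural rather than technical: whichever actor raises the flag, responsibility for acting on it converges on the platform's review process, which is precisely why the flag can be read as a marker of interaction among many hands rather than the duty of any single actor.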

Finding the right balance between moderating contentious content and enabling freedom of speech and expression is an ongoing experiment (Ibid., p. 8). As contentious content tends to circulate from one platform to another, it is important to view the management and regulation of such content from an ecological perspective (Ibid.). This is where governments come in, providing a framework for sharing responsibility among all key stakeholders (Ibid.). For example, following the 2016 US elections, which were marred by allegations of targeted misinformation and conspiracy-theory campaigns in digital spaces, the UK, Germany and other European Union countries have explored regulatory measures to compel platforms to remove contentious content (Faiola and Kirchner, 2017; Mukaddam, 2017, cited in Helberger, Pierson, and Poell, 2018, pp. 7–8). Responsibility involves all stakeholders and can take different forms, such as the platform’s organisational design responsibility, the user’s participation responsibility and the government’s responsibility to create a framework for implementing these responsibilities (Helberger, Pierson, and Poell, 2018, p. 11).

This discussion has shown that the NCII issue is a problem of many hands and that, in this case, a platform’s architectural design can become a useful tool for enabling cooperation among all stakeholders, especially users, to flag an NCII so that it can be taken down as soon as possible, before it is copied, downloaded and uploaded to other platforms. In Chapter V, I discuss some existing and proposed technological measures aimed at curbing sexting abuse.
