Chapter VI — Conclusion
Sexting and phone sex have emerged as tools for people to express their sexuality and fulfil their sexual desires with partners and lovers. However, those who sext can fall victim to the nonconsensual spread of their intimate content on the internet, which brings such leaked material under the purview of NCII. In the worst cases, this content reaches the internet after a couple breaks up, is stolen by hackers or data thieves, is sent to others by mistake, or is shared publicly.
Many factors can keep victims from reporting this crime, such as the fear of further victimization and embarrassment due to the private nature of the content. This can cause never-ending damage to victims, who live with the fear of being recognized, not knowing how many people have viewed the content or how many copies of it are out there.
It is imperative to stop such content before it goes viral, because by the time a victim discovers that their intimate photo or video has been leaked on a digital platform and requests its removal, the media may have been downloaded, forwarded and posted by several other internet users. There are several WhatsApp and social media groups that specifically host leaked intimate content and charge money for joining these digital communities.
The spread of leaked content seems to be a highly gendered phenomenon, as most porn websites feature far more women than men, and the majority of court cases and news stories involve female victims and male perpetrators. This suggests that the increased use of information and communication technology has led to the rise of a new form of gender-based violence.
WhatsApp, Facebook, and Instagram are the platforms where most cases of cyber harassment in Pakistan, up to 53%, have occurred. From the standpoint of platform responsibility, few platforms have adopted measures to stop the circulation of leaked intimate content, and the most commonly used applications have done very little.
As digital technologies have become indispensable to everyday life and relationships, the privacy of individuals has become fragile. Online intimate communication and expression are vulnerable to nonconsensual exposure, which can cause lasting harm to survivors.
It is pertinent to mention that while platforms fundamentally shape users’ activities, they do not determine what is shared through them. For example, WhatsApp allows users to share text, images, videos, GIFs, PDFs and stickers, but it does not control what that media contains. Many of the problems on online platforms are, to some extent, user driven.
If multiple actors, including users, platforms and the state, are responsible for shaping the digital space, then it seems to be a problem of many hands. The problem of many hands has been defined as a morally problematic gap in the distribution of responsibility. Legally speaking, the issue is about fixing responsibility on one or more actors in an enforceable way, as neither the platforms nor the users can be held fully accountable.
However, users alone cannot make platforms safer spaces. Placing this responsibility wholly on them, expecting everyone to be careful with what they share through digital platforms, would not be appropriate: some people, with the capacity, knowledge and inclination, would take care, while others who are less privileged or less concerned would not. Acknowledging user responsibility is beneficial, but it is crucial not to overlook the institutional responsibility of platforms.
Platforms have a responsibility of their own, such as complying with data protection laws and requests to take down potentially harmful or offensive content. They have an obligation to create the conditions that allow users to act responsibly, such as by creating awareness, informing and educating users. It has been observed that people with greater awareness of laws and rights are more likely to report digital crimes.
In the case of platforms, architectural design choices play a role similar to that of laws and procedures, which allocate and distribute responsibility within institutions. Reviewing design choices, such as flagging mechanisms, the configuration of recommendation systems, and the incentives for users to engage with content, can encourage the public to report cyber harassment.
WhatsApp prohibits the use of its service for illegal, harassing and obscene activities, warning that accounts that violate its terms of use may be disabled or suspended. However, WhatsApp appears either not to be complying with its self-declared responsibilities or to lack the capacity to remove leaked content from its platform. For example, during this research it was observed that nude and intimate videos of TikToker Hareem Shah, YouTuber Aliza Sehar and singer Rabi Pirzada were still circulating in WhatsApp groups and social media channels dedicated to leaked content, despite having been reported to the authorities.
According to some, it is almost impossible to get NCII removed from WhatsApp; according to others, platforms have ample capacity to shut down such malicious activity, and it is only a question of will. Finding the right balance between moderating contentious content and enabling freedom of speech and expression is an ongoing experiment. Because contentious content tends to circulate from one platform to another, its management and regulation must be viewed from an ecological perspective. This is where governments come in, providing a framework for sharing responsibility among all key stakeholders.
The issue of nonconsensual intimate images and videos (NCII) is a problem of many hands, and a platform’s architectural design can become a useful tool for enabling cooperation among all stakeholders, especially users, to flag NCII so that it can be taken down as soon as possible, before it is copied, downloaded and uploaded to other platforms.
By adopting forwarding controls, image-fingerprinting technology and restrictions on saving content to devices, platforms can become safer by design and architecture. Such systems can certainly help prevent sexting abuse and the spread of NCII by giving users agency over their self-generated content and a safer online experience. However, the problem cannot be solved by technological measures alone, as such measures may also restrict content that is neither harmful nor offensive.
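To illustrate how image fingerprinting can catch re-uploads of reported content, the sketch below implements a simplified "average hash": each pixel becomes one bit depending on whether it is brighter than the image's mean, so re-encoded or lightly edited copies yield near-identical fingerprints. This is a minimal, hypothetical stand-in for production systems such as PhotoDNA or PDQ; the function names, the tiny pixel grids and the distance threshold are illustrative assumptions, not any platform's actual API.

```python
# Minimal sketch of perceptual "average hash" fingerprinting.
# All names here are illustrative, not any platform's real interface.

def average_hash(pixels):
    """Fingerprint a tiny grayscale image (list of rows of 0-255 ints).

    Each pixel maps to one bit: 1 if brighter than the image's mean,
    0 otherwise. Minor edits (brightness shifts, re-encoding) tend to
    leave the bit pattern almost unchanged.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance signals a near-duplicate."""
    return sum(a != b for a, b in zip(h1, h2))

def is_known_ncii(candidate_hash, blocklist, threshold=3):
    """Flag an upload whose hash is close to any previously reported hash."""
    return any(hamming_distance(candidate_hash, h) <= threshold
               for h in blocklist)

# Example: a reported image and a slightly brightened re-upload.
original = [[10, 10, 200, 200],
            [10, 10, 200, 200],
            [10, 10, 200, 200],
            [10, 10, 200, 200]]
reupload = [[value + 5 for value in row] for row in original]

blocklist = {average_hash(original)}
print(is_known_ncii(average_hash(reupload), blocklist))  # True
```

The design point matters for the chapter's argument: because matching is done on hashes rather than on the images themselves, a platform (or a cross-platform clearinghouse) can block known NCII at upload time without storing or redistributing the intimate content, which is what makes cooperation among stakeholders feasible.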
Sexting abuse and NCII are interdisciplinary problems, and experts from other fields, such as human rights, digital rights, and the state and legislature, should be involved before such measures are deployed. Furthermore, this essay recommends further research into the emerging digital marketplace for leaked intimate content, as well as into the points at which individuals’ private data is most often leaked.