Expert’s Corner with Zara Ward from Revenge Porn Helpline & Stop NCII

Checkstep · Published in Checkpoint · 20 min read · Dec 19, 2022

This month we had the pleasure of interviewing Zara Ward, senior practitioner at the Revenge Porn Helpline.

The Revenge Porn Helpline (RPH) is part of the South West Grid for Learning charity and is a practical support service for adult (18+) victims of intimate image abuse (IIA) in the UK. These issues range from images being shared without consent and threats to share imagery to webcam blackmail and voyeurism. RPH offers information on the laws around IIA and supports clients in reporting content where needed, taking the burden of reporting illegally shared content away from victims.

The Report Harmful Content (RHC) service offers people aged 13+ a human voice when reporting content to many social media platforms. RHC is a web-only support service that can escalate harmful content that platforms have previously declined to remove.

Transcript

S: Welcome, Zara. I thought it’d be great to kick off the session with a quick intro about you, and then we can take a deep dive into some of the important work the Revenge Porn Helpline is doing in the UK. Does that sound good?

Yes, that sounds perfect. Great, so I’m Zara Ward. I work on the Revenge Porn Helpline.

I’ve been here about three years now, maybe more. I started off as a practitioner and I’ve worked my way up to senior practitioner. The main bulk of what I do is data research and technological research. So, yeah, I process all the data, do the reports, and showcase how technology can hinder people who’ve experienced IIA (intimate image abuse), or positively support them to protect their images, and things like that.

My background is in psychology and psychological research. So yeah, data is my thing!

S: What was the reason that you joined the Revenge Porn Helpline?

I’d just finished university and gone off travelling for a year, and I was like, I really, really want to get back into helping people and giving back. Then I saw an event come up for the Revenge Porn Helpline and I was like, hmm, what is this? So I looked into it, realised the scale of the problem, and thought, “yeah, okay, this is what I can see myself doing”, and here I am, still doing it.

S: Why was the Revenge Porn Helpline set up? What is its origin story?

The Revenge Porn Helpline was set up in 2015. So, our charity sits underneath a wider organisation called the South West Grid for Learning and one of their helpline branches is the Professionals Online Safety Helpline.

That helpline supports the children’s workforce: teachers who are experiencing issues relating to schools or relating to children. One of the managers, Laura, saw that intimate image abuse, or revenge porn, was quite commonplace in these kinds of situations, and that it was actually against the law.

So as it became more of a problem, from 2015 when the law was introduced, they thought, “oh, well, let’s set up a helpline and see how popular it becomes”, and over the years the need has increased and the support has become so much more developed and complex. So yes, we are very, very busy!

S: What is the current law in the UK with regards to intimate image abuse? And how is that being monitored and more importantly, enforced?

That’s a really interesting one. Since 2015, it has been illegal to disclose private sexual imagery with the intent to cause distress.

That is the specific law: it has to be private sexual imagery, and it has to be shared with the intent to cause distress. As you can imagine, the intent to cause distress is a really difficult thing to enforce.

So we call it the “LOL clause”. You can just say that you did it for a laugh, or because she was fit, or they were fit, or whoever, and that gets you out of the “causing distress” tenet of the law. That happened in 2015, and since then we’ve been campaigning to get the law changed.

Recently the Law Commission did a review, and they recommend that the law be adjusted to remove that intention requirement from the base offence and add bolt-on offences. Depending on the status of our government over the next couple of years, we’re hoping that will be enacted.

More recently, in 2019, upskirting was made illegal as part of the voyeurism offence. That’s where somebody takes a picture up somebody’s skirt or down their blouse.

Then, even more recently, threats to share intimate content were added to the Domestic Abuse Bill as their own specific offence. But again, enforcing something like this is incredibly difficult because it happens online most of the time. You don’t have direct cause and effect, you don’t have evidence, you don’t have information. A lot of the onus falls on the victim to keep logs and preserve evidence.

Because these crimes are relatively new compared to other crimes (murder is a very old crime, whereas intimate image abuse is under seven years old as an offence), a lot of police forces aren’t knowledgeable about it. A lot of police forces feel very much like a fish out of water when it comes to enforcing these kinds of crimes. We hear that it’s about a 50% split whether people get a positive or a negative response from the police, and a positive response is just the police doing their job. We know that the institutional handling of these kinds of crimes needs to change. I think it is slowly changing, but it does take a very long time to drip down.

S: I think a large part of this is also education. Everyone from the victim, to the police, all the way through to the person sharing those images, needs to know what it means and the consequences that follow. And I suppose, on the law enforcement side, knowing what they need to do and where improvements can be made as well?

We still hear a lot of the time that police officers will say, “just turn your phone off”, “just forget about it, it’s online”, “it shouldn’t affect you”, “you shouldn’t have taken the picture in the first place”, “crop out your face”, or “your face isn’t in it, so why does it matter?”

There is such a different view of a physical body when it’s in a picture online compared to a person’s actual body. There needs to be a lot of education around respecting people’s intimate content; they’re like gifts, and you wouldn’t just treat a gift horrendously. Hopefully we can keep educating people on these subjects, and I think we are bringing up quite a woke generation in the next few years, which is brilliant.

I think people are learning a lot more about the importance of treating each other better and hopefully that will reduce the crimes, but who knows.

S: In terms of demographics, like gender, sexuality, ethnicity and age, have you identified any higher-risk groups from your research when it comes to revenge porn?

We don’t really capture that much demographic data because we are a confidential helpline, but we know that for specific cultural groups, a sexual image of somebody getting out can be hugely different. They can suffer abuse, real threats of violence, real threats on their lives. And for somebody in a domestically violent or abusive relationship, any kind of picture of them getting out there can be absolutely dire. So it depends on the circumstances.

It also depends on your job. If you are a teacher or a doctor, for instance, it could completely ruin your career.

Because the victim is still slut-shamed in these situations, and all of this has to be investigated, we’ve had cases where foster carers were put under investigation because an image of them had been illegally shared. It is horrendous not only to attack a victim when they’re at their most vulnerable, but also to take away so much of their personality.

We know that having intimate images shared disproportionately affects women, and that women’s images are shared more widely. I think our statistic is, overall, something like one image per man versus around 40 images per woman, among the people who come to the helpline with images to report. So it’s a tiny amount for men. I’m not saying one image isn’t already too many, but when the number is so much higher for women, it shows how much their content is being re-shared and how far it’s getting out there, and therefore how different the impacts are.

S: When we spoke before, we discussed the BBC News article about the secret world of trading nudes, and also the articles that came out about Reddit, and how this is being tackled from a big tech perspective.

There is an offshoot of revenge porn labelled “collector culture”. Is this something you’re finding more of, and how does it differ from a few years ago? I know there’s now a financial element that plays into this too. It’s not just the revenge part, someone who’s maybe been wronged or feels justified, but also a financial element that’s come into the mix over the last few years.

Yes. People are becoming more knowledgeable about how to find content of themselves and how to find out if they’ve been talked about online, so they’re doing more searches for themselves.

The perpetrators are becoming more privy to that kind of information too. So instead of writing “Zara”, for instance, they’ll write “Z4RA” to make it harder to find, and if that doesn’t work, they’ll monetise it. They’ll put it behind a paywall, because if you want to see that person’s images so badly and they don’t have an OnlyFans, if they’re not openly sharing that kind of content, then people will try to get it in other ways. Collector culture has been a huge issue that we’ve seen for a while. Way back with Hunter Moore and IsAnyoneUp?, he shared pictures of people locally, and that is pretty much what’s happening now, but geographically: people will find forums based in Devon, or Cornwall, or wherever, and it will break down further and further into different towns. They’ll say, “have you got images of such and such?”, somebody will anonymously post it, and then, if somebody’s got a whole collection of that imagery, they can use it to sell the content. People are passing and trading content like a card game; it’s ridiculous.

The internet is such a big place, but when you drill down to that granular level it becomes tiny. Knowing that people in your local area could potentially be searching for your sexual content is a whole level of privacy violation that people haven’t even thought about before.

S: The first thing that comes to my mind is also the security and safety of sharing certain images. We saw this on Reddit: it spiralled into a situation where a lady was being blackmailed, with threats to share the information and images with her parents. She came from a Muslim background, so it became something very sinister. I think the guy who was caught, and actually confessed to it, said he didn’t do it with intent to injure or harm, but he inadvertently put her into a very unsafe position where she could have come to harm or, you know, her parents could have disowned her, for instance.

Yeah, and people don’t think about that kind of stuff. I see it a lot on Reddit and those forums: “What’s her @? What’s her Instagram? What’s her Twitter?” And it’s like, that’s none of your business; they are not a public person, they are not somebody you can scrutinise horrendously.

Last year I did some research on this. I was really interested in the way people become aware of their content being shared online, because, putting myself into that situation, I was thinking: how would I want to find out the worst news imaginable? And I would want my mum to tell me, because I love my mum.

My mum knows how to talk to me. So I was like, “okay, let’s do a bit of research and figure out, archivally, how people get to know this information”. I broke it down into four routes: 1) they found it themselves, using Google or facial recognition; 2) they were told by the perpetrator who uploaded the images or made threats to share them; 3) they were cold-messaged by somebody on social media; or 4) they were told by their friends and family. All of those options are absolutely horrendous, but the one that was so surprising to me is that being cold-messaged accounted for 25% of the cases I looked into. That is people blind-messaging you: they’ve found your Instagram or your Twitter, they’ve got your naked images, and they’ve said, “I’ve seen these of you online”. That is the worst possible outcome to me personally, because there is no safeguard there. You are completely bamboozled by that situation.

You’ve not prepared yourself for it. Your parents, friends and family would prepare you for that kind of conversation. The perpetrator could potentially prepare you (maybe negatively). If you were searching for yourself, you could prepare yourself. But being thrown into that conversation when you’re just looking at funny cat videos is bonkers to me, and the outcomes are so diverse. That, to me, is the scariest thing, because you’ve lost your privacy: privacy over your body, privacy over where you share your personal photos, and just your life generally.

It’s horrendous.

S: The word that came to my mind was feeling violated. If that were to happen to me and someone were to message me saying, “hey, I’ve got these images of you”, I would feel very unsafe. I would feel there had been a violation of me as a person, and I suppose the psychological impact isn’t just that one moment in time or what rolls out from that interaction. How do you get over that and then trust and feel safe using the internet, or sharing information or pictures of yourself, ever again?

Yes, totally, and how you see and interact with other people.

All of our relationships now are mainly online. When you go out on dates, you’ve met the person online; it’s very rare that you meet offline. Initial contact with somebody happens online. Imagine that every single aspect of your life has been online, and then you get this message saying, “here are your images”.

Most of the time they’re on a burner account, and most of the time they’re not going to tell you where those images are; they’re just going to send you an image or a link or something really obscure. You then have to do all the detective work (potentially), as well as processing the issue, as well as trying to understand where all your personal information has come from. Have you posted anything that could reveal your location, where you work, where you go to school? It completely changes the game. With every single person you interact with online from that moment onwards, you’re going to wonder, “have they seen it?”, “were they the person who released that content?”. It’s a completely different ball game.

S: Obviously at Checkstep we work with lots of online platforms, and one of the things we were most interested in when we first got in touch with you was what online platforms can do to reduce the chance of intimate images being used and traded in this way. What is their responsibility, legally but also morally?

I think a block-first policy is always a really good step. If you’re getting a report of a private sexual image from somebody, just block it. Take it seriously, because if somebody then appeals and says, “oh no, actually that’s a picture of a famous porn actress”, absolutely fine.

There’s more than enough porn on the internet. It wouldn’t matter if something went down for a couple of hours, and it says that we are putting victims first, rather than putting commercial gain or views first, which happens on a lot of adult websites where they’re like, “yeah, but it’s going to get views”.

It’s like, we know that, but that’s the point: we don’t want somebody’s really intimate moment getting 300,000 views when they’ve not allowed it to be online. If you have a block-first policy and handle appeals afterwards, it usually works quite well.

Obviously there should be a lot more safeguards to stop that kind of content being shared in the first place, but some places are going to host adult content, and I think that’s really challenging.

I think having hashing lists is a huge plus as well, where content has been verified as NCII (non-consensual intimate imagery). Just pop an upload through a hash bank, check that no fingerprints match, and make sure the content is being shared in an okay manner, or maybe verify the people who are looking to share adult content.

It’s not perfect, because you could still technically share illegally shared content if you are a verified creator. But I think there’s a lot platforms can do to safeguard the sharing of adult content more generally, because if you are not a verified person, why should you be allowed to upload loads of sexual content? You’ve got no accountability for sharing that content in the first place. If you’re a production team, that’s fine, you’ve got accountability. If you’re an individual, you should have accountability for what you’re uploading, but unfortunately, that’s not the way the adult platforms work.
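To make the hash-bank idea concrete, here is a minimal sketch of an upload-time check. It uses the open-source imagehash library as a stand-in for a production perceptual hash such as PDQ, and the bank entry and distance threshold are hypothetical placeholders, not anything StopNCII actually publishes.

```python
# Minimal sketch of an upload-time hash-bank check (illustrative only).
# `imagehash` stands in for a production perceptual hash; the bank entry
# and threshold below are hypothetical placeholders.
from PIL import Image
import imagehash

# Hypothetical bank of perceptual hashes of verified NCII content.
HASH_BANK = {imagehash.hex_to_hash("f0e4c2d7a18b3c5d")}

MAX_DISTANCE = 8  # Hamming-distance threshold for treating two images as a match

def matches_ncii_bank(upload_path: str) -> bool:
    """Return True if the upload's fingerprint matches a banked hash."""
    upload_hash = imagehash.phash(Image.open(upload_path))
    return any(upload_hash - banked <= MAX_DISTANCE for banked in HASH_BANK)

# An upload that matches is held back before it ever goes live.
if matches_ncii_bank("incoming_upload.jpg"):
    print("Upload rejected: matches a verified NCII fingerprint")
```

A perceptual hash, unlike a cryptographic one, changes only slightly when the image is resized or re-compressed, which is why the check uses a distance threshold rather than exact equality.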

S: The idea of block-first is really interesting, because we’ve seen with other types of online harm that content might only be online for a very short time, but it takes just one person to copy it and all of a sudden it can pop up everywhere.

Block-first policies are definitely worth considering and thinking about from that side. It’s very difficult when so much of this type of material is being uploaded every millisecond, but there are things that can be put in place to at least reduce the amount of time it is visible online.

Yeah, definitely. Especially on a platform with a high volume of adult content, it would be hard to say, “before you upload it, it goes into a moderation queue, gets moderated, and then gets uploaded”; that would just create a bottleneck. But having a policy of “if you report something as a victim, we will take action on it straight away” reduces the need for human moderation, because you only need to moderate at the appeal stage. It also shows victims that you’re putting them first. A lot of people won’t even think about reporting because they assume, “oh, if I report, they’re not going to do anything”, but it makes a difference if it’s at the forefront of your privacy notices: if you report something that you believe is intimate image abuse, we will take it seriously.
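Here is a minimal sketch of that block-first flow, with hypothetical names throughout: reported content comes down immediately, and human review time is spent only on appeals.

```python
# Sketch of a block-first report flow (all names are hypothetical).
# Reported content is hidden immediately; humans review only appeals.
from dataclasses import dataclass

@dataclass
class Report:
    content_id: str
    reporter: str
    status: str = "received"

appeal_queue: list[Report] = []  # human moderators work from this queue

def take_down(content_id: str) -> None:
    print(f"[moderation] content {content_id} hidden from the platform")

def notify(user: str, message: str) -> None:
    print(f"[notify:{user}] {message}")

def handle_iia_report(report: Report) -> None:
    """Block first: remove the content as soon as a victim reports it."""
    take_down(report.content_id)
    report.status = "taken_down"
    notify(report.reporter, "Your report was actioned immediately.")

def handle_appeal(report: Report) -> None:
    """Appeals, not uploads, are what consume human-moderator time."""
    report.status = "awaiting_human_review"
    appeal_queue.append(report)

# Example: a victim reports content and it is removed straight away.
handle_iia_report(Report(content_id="post-8841", reporter="victim@example.org"))
```

The design choice is the ordering: takedown is automatic and instant, and the expensive human step is deferred to the (much rarer) appeal, rather than gating every upload.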

S: And for the users themselves, obviously the platforms can put some safeguards in place, but what do you recommend a user does if they discover that they’ve been a victim of revenge porn? Or a family member or friend who finds out that someone they know has been a victim?

What would be your advice and what should they do next?

It’s such a difficult situation; you don’t know how much is going where. If you are a family member or a friend, get in touch with a service, like an emotional support service, and get information from them about how to approach it, especially if the person has any additional needs or mental health needs; just make sure you’re safeguarding them as much as possible. Also safeguard yourself: don’t open links, save the link if you can, or screenshot the chat. And come to a website, not to toot our own horn, like the Revenge Porn Helpline, which has a step-by-step guide on how you can report content and what you can do. Then you can be knowledgeable before you go to your friend, because you don’t want to say “oh, by the way, your images are online” and not have any solution for them.

You can say, “right, I’ve spoken to these people because I wanted to make sure I was giving you the best advice. This is what they offer to do. Would you like me to help you through it? Would you like me to talk to them for you? Would you like to see the images?” That gives them the power and the ability to say, “actually, I’m not in the right headspace right now to view that content and take that burden on; yes, could you please talk to them for me?” or, “actually, no, I would like to take responsibility for making all those calls myself”. We do take calls from friends and family members, because we appreciate the emotional load; talking to somebody when you’re in that really hypersensitive state is really challenging.

If you have a link and it’s live, send it to us. We will do all the digging. We proactively search and proactively report; we just need confirmation that the person is over 18, that they’re the person in the image, and that it has been non-consensually shared, and then we will go away like little squirrels and just keep going and going.

Our team probably spends about four or five hours a day reporting, on average. It’s all manual, all done by a lovely team of pretty much three practitioners at this point. We are persistent and we are annoying. We will keep asking websites to take content down, and we will find different ways to get them to do it. We also advise people on how to remove their personal information from Google, so that their sexual images aren’t attached to their name. If you’ve got a professional profile, for instance, you don’t want your sexual images attached to it when somebody searches for your name.

We offer advice on how to report it to the police, what the process looks like, and how to get more legal support for free. We have all of that information, so whether you’re a friend or the person affected, we will give it anonymously; we don’t need to know who you are. The more information you can give somebody, the better the decisions they can make for themselves going forward, which I think is the best thing they can have in that moment.

S: We’ve spoken before about the fact that it doesn’t end with the reporting and removal of the image or images that are online; ongoing support is often required to be able to trust again and interact online again. Where can people go for additional support and help if they need something more in terms of psychological support?

It’s really challenging, because there’s no specific intimate image abuse support service. You’ve got Rape Crisis, which is more for offline rape and sexual assault, but I think they do give some counselling. Obviously the NHS has counselling, but that’s more low-level and wouldn’t be for specific trauma. It’s a piecemeal approach. In reality, what we would love is to be able to offer some kind of emotional support ourselves, because we see both sides: the practical implications, but also the ongoing emotional implications. I would love to be able to do that, or to offer some pro bono counsellors. That would be fabulous.

A lot of people say it’s like being sexually assaulted a million times. Every time you open your phone, you are terrified that that person has found you and is showcasing your images again, and you open your phone probably a thousand times a day; god knows how many times I open mine. If that were your house, how would you feel if somebody was entering it that many times a day and making you feel unsafe? That is not something you can easily get over. People need very complex, trauma-informed therapy in these kinds of large-scale cases.

We know that people’s need for support peaks and falls depending on how their case is being managed at the time. They could have a period where no content is shared, and then it could come back up again. It’s so case-dependent. Ideally, yes, we’d love to provide it; I just know it would be quite a long process. But put it at the front of every application you make for any kind of counselling: say, “I’ve experienced this level of trauma”, and then hopefully they can adjust accordingly. I think that’s important.

S: In terms of our Q&A today, that’s all the questions I have. I want to say thank you, Zara. Is there anything else you’d like to add or speak about while we’re together?

I guess the only other thing is Stop NCII. We’ve spoken a lot about the reactive side of things, and we’ve noticed that people share intimate images; it’s kind of obvious now. I don’t know anybody who hasn’t shared intimate images at this point.

We wanted to build something that was preventative rather than reactive. Last year we developed a tool to stop non-consensual intimate image (NCII) sharing. Meta financed it, and we got third-party support from the software developers, InCrev, who have been amazing.

They built a hash bank: basically what already existed for child sexual abuse material, but for adult material. What’s so different about this is that it’s on your device; the whole hashing algorithm runs on your device. It’s web-based, and your images never leave your device.

The images never come to me or anybody else; the hash of that intimate image is the only thing that gets sent to the cloud. Participating platforms can then run the hash bank and match any content, which leads to a human moderator. It kind of gets you to the front of the queue when it comes to human moderation, or when it comes to being notified.

It works retroactively as well. If your content was shared previously and you add your hash to the hash bank later down the line, it should check previously uploaded content too.
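As a rough illustration of that on-device design, here is a minimal sketch, again using the imagehash library as a stand-in: the image is hashed locally and only the fingerprint is transmitted. The endpoint URL and payload shape are hypothetical; the real StopNCII tool runs in the browser with its own hashing implementation.

```python
# Sketch of on-device hashing: only the fingerprint ever leaves the device.
# The endpoint and payload are hypothetical; the real StopNCII tool runs
# in the browser with its own perceptual-hashing implementation.
from PIL import Image
import imagehash
import requests

def submit_hash(image_path: str) -> None:
    """Hash an intimate image locally and submit only the hash."""
    local_hash = imagehash.phash(Image.open(image_path))  # computed on-device
    # The image bytes are never uploaded, just this short fingerprint.
    requests.post(
        "https://example.org/api/hash-bank",  # hypothetical endpoint
        json={"hash": str(local_hash), "type": "photo"},
        timeout=10,
    )

submit_hash("my_private_photo.jpg")
```

Because only the hash is banked, retroactive matching falls out naturally: a newly submitted fingerprint can be compared against content platforms have already indexed, not just future uploads.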

The tool seems to be really popular, and it is genuinely really, really easy to use. So if you feel there’s a sexual image of yourself that you don’t want online, or you want to safeguard it now or for the future, I would really recommend looking into it.

I think a lot of us here have used it; we’d be really bad practitioners if we hadn’t. And we see the use in it if you’re out dating, in the very normal sense, if that’s what’s happening.

Also, if there are images of you that were shared in the past and you have access to them, say you found an image of yourself on xHamster or wherever, you could download that image and hash it. Then, in the future, if we get xHamster on board as a platform, it would take all that content off. It just offers a level of safeguarding.

Obviously you have to be an adult, and it has to be a sexual image, so showing genitals. It can be a video; you can have somebody else in it, as many people as you like, as long as you are in it. It doesn’t have to show your face; it just has to be naked or near-naked.

I’m the first level of support, so if anybody has any questions, they can email us at stopncii@swgfl.org.uk and I will answer them. I’m pretty much the first port of call for anything on that.

Helpful links

Revenge Porn Helpline

South West Grid for Learning Charity

Stop NCII (+ explainer video)

Report Harmful Content Service

NHS — Access to mental health services

Rape Crisis

Victim Support UK
