TW: This article references sexual assault and deepfake porn exploitation.
Deepfake porn is fast becoming one of the greatest threats to women online. In the wake of sexually explicit deepfakes being created of Taylor Swift, there are renewed calls for greater legislative protections against this form of online abuse – not to mention more severe punishments for those who commit it.
It's the topic of a new documentary – Another Body: My AI Porn Nightmare, directed by Sophie Compton and Reuben Hamlyn – airing tonight on BBC Four. The documentary follows Taylor Klein, a 22-year-old American woman who discovers pornographic deepfakes of her and her friends online. When the police fail to intervene, she tracks down the perpetrator herself. To say this is a ‘must-watch’ is an understatement.
Here, GLAMOUR revisits Jess Davies' investigation into deepfake technology, exploring how it's being used as a tool to intimidate and punish women.
I sat in heavy silence, scrolling through hundreds of requests from men, all asking each other for the same thing: to help them make deepfake porn.
I was researching this new technology for a BBC Three documentary and uncovered an endless stream of women's images, seemingly all stolen from their social media accounts and posted in forums alongside the men’s desires: ‘Would anyone be so kind and nudeshop this girl from my town’, ‘Someone can fake c*m or swap my mother’, ‘Can you try this milf 41 and her daughter 18 please’.
One post simply requested ‘F*cked on her back please’ alongside an image of a woman fully clothed, holding a baby. This technology was no longer being used for five seconds of fun that saw us swap our mates’ heads into a Miley Cyrus music video. This was sexual exploitation on a terrifying scale. Deepfake porn became a viral topic of conversation after Twitch streamer Brandon Ewing, who goes by ‘Atrioc’ online, was caught accessing – and later admitted to buying and watching – non-consensual deepfake porn of his fellow female Twitch streamers.
One of the women featured on the subscription website who had been deepfaked was Blaire, a streamer who goes by the name QT Cinderella. When I spoke with Blaire about her experience and how it made her feel to see herself in sexually explicit content without her consent, she shared how being deepfaked, and the misogyny towards her online has made her contemplate coming offline altogether.
“You lose faith in humanity when you’ve been on the internet for so long. It’s been really sad to see some people’s opinions on it. I think regardless like, it’s your face but maybe it wasn’t our bodies but you’re making money off of my face without consent or you’re sexualising me for literally existing.”
“It's really sad when people say like ‘Oh it's not your body why do you care?’ I think that's what a lot of people online were saying and I almost had the same opinion before I saw the photos but I felt a psychological shift […] I assure you that once it happens to you or maybe someone in your life where this is happening against their will, you might be able to have a better grasp of how it really impacts you in a way you can't control.”
While the content may not be ‘real’, the trauma responses and psychological effects it has on those who are victims of deepfake porn very much exist in the real world. Blaire shared how she and other women who had been targeted in the Twitch scandal compared being deepfaked to being sexually assaulted, with one woman feeling so much shame around her body that she couldn’t look in a mirror.
“One of the girls admitted to me that she couldn't shower for days after this because she just felt so gross; she didn't wanna see her body, she was doing anything to avoid looking at her body and the first time that she showered and she took her clothes off, and she walked past the mirror in the bathroom she just sobbed.
“I was sexually assaulted as a child, and the same feeling is there; it’s like once you’ve experienced something like that, you just know, you know, the feeling like the back of your hand every single time you've been taken advantage of by a man in a sexual way. That feeling creeps back into your heart and it was so similar of that feeling the first time I saw that photo. You sit there for a second, and you almost gaslight yourself. Is this real? Did I do this? Because it is so convincing, and there's this weird sense of violation that you feel after a grown man tells you not to tell anybody, right? It's that same level of guilt, that same level of feeling used. You’ve just been used by another guy against your will, and what can you do about it?”
A disturbing feature of this online community is that those creating and consuming deepfake porn have appointed themselves the arbiters of whose consent matters when choosing their victims.
Most deepfake websites and forums state that they only host videos or images of celebrities or public figures because they are ‘different to normal citizens’. The most popular deepfake porn website sets its threshold accordingly: any model used or requested must have a minimum of 100k followers on YouTube, Twitter, Twitch or Instagram. Yet I found an approved paid request for a female politician with only 30k Twitter followers and fewer than 500 Instagram followers.
This so-called moral red line is just another way for men to assert their authority over women and take control of their sexuality. No one is more or less deserving of being put into a hardcore porn film, or of being stripped of their clothes and posted naked online without their consent; once we start blaming women for the exploitative actions of these perpetrators, we slide into an all-too-familiar pattern of victim blaming. ‘How much did you drink?’ ‘What route did you walk home?’ ‘What were you wearing?’ ‘You did laugh at his jokes though…'
If you worked in a supermarket and someone came into your workplace and sexually harassed you, you would quite rightly not be expected to ‘put up with it’ and would be encouraged to report this behaviour to the police. That same empathy and understanding should be extended to anyone, regardless of the way they make a living.
“It’s disappointing," Blaire says. "A lot of people say ‘Oh since you’re a part of the internet it’s what you should expect’ but it shouldn’t be. Could we just use the internet for good for once? Could we spread kindness instead of porn of people who didn’t sign up for it?”
Other female content creators logged on to share their disgust at the trend, highlighting its exploitative and non-consensual nature. Ellie Schnitt, a podcast host and social-media influencer, posted in a now-viral tweet: “I’ve said it before with regards to leaking female content creators’ nudes, and I’ll say it again with regards to this deepfake pornography stuff – the reason you’re looking is because the lack of consent turns you on. You like it because these women did not consent. That’s it.”
Discussing her decision to speak out about the Twitch scandal, Ellie told GLAMOUR, “It was actually incredible to see these men willingly tell on themselves by saying it's just a fantasy, it's not that deep, or simply telling me to shut up. The question I have is, where does the line get drawn? We all post online constantly. Instagram, TikTok, and Twitter, everyone uses social media and posts photos of themselves. Does that mean they've lost the right to get upset if their image is used in pornographic material against their will?”
“This is something that can happen to anyone. There was a case in New York where a man had been creating deepfake pornography of young women he had gone to school with, using photos from their social media from when they were in high school and even as young as 14. It's incredibly disturbing that this is possible and, I think, even more disturbing that there is a demand for them.”
Consent is the sticking point when it comes to the creation and distribution of deepfake porn. When I spoke to the owner of one of the largest deepfake porn websites for my documentary, he stated, ‘I don’t really feel that consent is required. It’s a fantasy. It’s not real.’
In the past, people would live out their fantasies in the privacy of their own minds, but now your deepest, darkest sexual fantasy can exist in the physical world and online for anyone to watch, turning one person's dream into a living nightmare for the woman whose digital footprint is now altered forever by this sinister trend. When there are literally millions of consensual porn videos available to view and purchase online, it makes you question what is driving the popularity of deepfakes. Blaire shared, ‘There is a level of fascination that people have over non-consensual content in general.’ And she’s not wrong. Similar to the demand for leaked nudes from hacked iCloud accounts or those scraped from OnlyFans servers, the thrill of viewing a woman’s body without permission is a grim reflection of men’s entitlement when it comes to women’s boundaries and the ownership of our bodies.
Ellie believes that the demand for deepfake porn “absolutely boils down to consent, in my opinion. As I said, there are millions of consenting sex workers out there, and you are choosing to purchase pornography of someone who has not consented to participate in sexual content. It makes people uncomfortable to think that what they're doing is non-consensual, but it should – I think these men need to read that word and really internalise it and start to understand why they should be ashamed of their actions.”
But it isn’t that the deepfake community do not understand the concept of consent. On one deepfake porn website, I found a whole forum dedicated to reporting other users for stealing video content and re-posting it without credit. One comment reads, “I have been a long-time subscriber to many creators, and it disgusts me to have their hard work been stolen and re-sold like this.” Another points to the Twitter account of someone re-posting deepfake content without permission, encouraging others to report the page to stop him from reselling the “stolen videos as his own.” One user mentions consent specifically, sharing that “Obviously they don’t have consent for the distribution, they are just MOFOs who want to earn with the work of others.”
The hypocrisy of their anger at having their own work stolen without consent would be laughable if it didn’t emphasise that these men do not need lessons in understanding consent. They already understand it. They simply do not care to seek it when it comes to a woman’s body.
Whether your job is being a celebrity, a Twitch streamer or a public servant, no one is asking to be taken advantage of sexually without their consent. The online world reflects the real world, and its users exist in our everyday lives. Turning a blind eye and acting as if the internet is a lawless place that should be left to its own devices is choosing to ignore an army of perpetrators who spend their spare time sexually objectifying women online before walking downstairs and joining their wives and daughters for dinner – the same women about whom they have just posted requests in forums, begging other men for “c*m tributes” and “fakes.” It is a terrifying thought: if these men feel so entitled to live out their sexual fantasies non-consensually online, what is to stop that entitlement carrying over into reality and onto our streets?
When I asked Blaire what she would say to someone who was thinking of consuming or creating deepfake porn, she said “I would hope that you could find a level of empathy to pause, pause for just a minute and think about all of your friends, looking at a video like this, but of your sister or of your mother and how that would make you feel? And your mother or your sister don't wanna be a part of this and they just have never made sex work of themselves. It's not fair, it's not okay and I imagine that would cause you some sort of visceral reaction and that's how you should feel about any woman that isn't consenting. You should have the same level of empathy as you would for a mother or a sister, because none of us want to be a part of it.”
Unfortunately, this is already happening to mothers and sisters online and none of them deserves it, regardless of their career choices. Deepfake porn isn’t ‘real’ of course, but it looks it. And that's enough to have real-life consequences for the women who find themselves targeted by this disturbing and exploitative trend.
If you have had your intimate images shared without your consent, remember that you are not alone, and there is help available. Get in touch with the Revenge Porn Helpline at help@revengepornhelpline.org.uk. There is also a step-by-step guide on notyourporn.com, which should be followed before taking any action.