Kate Isaacs was scrolling through her Twitter account when a video popped up amid her messages and notifications. As soon as she clicked play, she felt sick. The film was a pornographic one, featuring a woman in the middle of a sex act. The woman, she realised, was her.
The shock felt like a physical blow. ‘I remember feeling hot, having this wave come over me. My stomach dropped. I couldn’t even think straight. I was going, “Where was this? Has someone filmed this without me knowing? Why can’t I remember this? Who is this man?”’
Her mind was such a whirl that Kate, 30, didn’t immediately realise the obvious — it was not her in the footage; not her body anyway. Someone had taken her face and digitally superimposed it onto the body of a porn star to make it look as if it was Kate in the explicit video.
‘But it was so convincing, it even took me a few minutes to realise that it wasn’t me,’ Kate says. ‘Anyone who knew me would think the same. It was devastating. I felt violated, and it was out there for everyone to see.’
Welcome to the world of so-called ‘deepfake’ porn. All it takes for someone to turn you into a porn star is for them to have access to your image. Best results are achieved with moving footage, meaning an innocent video posted on your Instagram account, say from your cousin’s wedding, would be ideal.
Even better is a few minutes lifted from an office Zoom meeting where your face is clear and animated. Deepfakers can upload this to a program that uses artificial intelligence software — the same kind you might use to put your face into a film clip for fun — to transplant your facial likeness onto the body of someone in an existing porn clip.
Kate never did find out who made the offending porn video of her, but believes she was specifically targeted because she had spoken out about the rise of ‘non-consensual porn’ in the past.
As the founder of a successful campaign called NotYourPorn, launched in 2019, she had been outspoken about the issue of images of women, often nude pictures, being used for pornographic purposes without consent.
Yet the fake porn video of her, which appeared in 2020, had been made using entirely innocent footage taken from the internet.
‘This is all it takes now,’ she points out. ‘It makes every woman who has an image of herself online a potential victim; that’s pretty much all of us, these days.’