It’s Not Illegal For Someone To Steal Your Images And Turn Them Into Porn

Incredibly, using tech to create porn featuring your stolen image is not a sex crime. Georgia Aspinall reports on how MP Maria Miller hopes to change that.


by Georgia Aspinall

Digital sexual abuse is on the rise – and perpetrators are only getting more creative in the ways they violate women’s safety. Alongside revenge porn, where explicit images or videos of a person are posted online without their consent, women are now falling victim to an increase in ‘deepfake’ pornography.

Deepfakes are digitally altered content where a person’s face is superimposed on to another’s body, typically with the intention of harming their reputation. Deepfake pornography sees perpetrators merge their victim’s face with existing pornographic content – and then post it online.

You don’t need to be particularly tech-savvy to do it; you just download an app. Nudify, an app with more than 100,000 downloads, helps you ‘edit photos with naked effect’, while DeepSukebe, a website that had over 38 million hits by August 2021, promises to ‘nudify everything’ in a picture at the click of a button.

‘Any woman can fall victim to it when this kind of software exists,’ Maria Miller, Conservative MP for Basingstoke, told Grazia. ‘There is a patchwork of laws out there, including the Malicious Communications Act, but most police don’t know how to apply it because training on how to identify deepfakes is very inconsistent.’

According to artificial intelligence (AI) research group DeepTrace, the number of deepfake videos online rose by almost 100% from 2018-19 – and that was before revenge porn statistics skyrocketed during lockdown. In 2020 alone, calls to the Revenge Porn Helpline increased by 87%. Deepfake pornography overwhelmingly targets women and the material is often violent, including depictions of rape. It means victims can be left traumatised, forced to witness hauntingly realistic videos of themselves being abused while knowing countless others can see it too.

Maria is therefore calling for the creation and sharing of deepfake pornography without consent to be made a sex crime in the UK – as well as the use of ‘nudification software’. In a debate in Parliament this month, she urged the Government to ‘recognise technology and AI is being used to inflict sexual attacks on women and girls’.

It’s certainly a problem model Vogue Williams recognises. She spoke out about falling victim to deepfake pornography last month. ‘I saw my face plastered on all these porn images and I was like, “Jesus”,’ she told Angela Scanlon on RTÉ’s chat show Ask Me Anything. ‘Do you know what the weirdest thing is? That you can’t get them down… anyone is allowed to put them up and they are starting to look more realistic.’

Helen Mort found out from an acquaintance that deepfake images of her were circulating on a porn website. Pictures were being taken from her social media, including some taken while she was pregnant, and uploaded on to the site, where users were encouraged to merge her face on to explicit and violent videos. In fact, they had been online for years and she still has no idea who started it.

‘The underlying feeling was shock and actually I felt quite ashamed, as if I’d done something wrong. Then for a while I got incredibly anxious about even leaving the house,’ Helen says. She still has nightmares about them now.

Alongside changing the law, campaign group My Image My Choice is also calling for better education on revenge porn among students, as well as funding for victims. Its petition at Change.org currently has almost 50,000 signatures.

According to Maria, the Government ‘clearly understands it’s a growing problem’ but the response she’s received from ministers shows they don’t know how to move forward. ‘The Law Commission has made recommendations,’ she explains. ‘But we need to speed up and enact them so we don’t leave every woman in this country worried this could happen to them.’
