Last week, a disturbing new app was taken off the market. Called DeepNude, it allowed users to create realistic 'naked' shots of any woman whose photo was uploaded, by replacing her clothes with images of intimate body parts.
That meant anyone who purchased the app for $50 (£40) could create what appeared to be nude pictures of a woman without her consent (a free version created images just as graphic, but stamped with the word 'FAKE' in one corner). And yes, the app only worked on images of women: when the website Motherboard tried it on a photo of a man, his trousers were simply replaced with a vulva.
Amid a huge backlash, its anonymous creators pulled the plug, saying they needed to 'fix some bugs and catch our breath'. But while that particular app might be gone, the uproar left in its wake drew attention to a growing problem: the rapid rise of so-called deepfake technology.
Based on artificial intelligence software, 'deepfake' programs allow images of real people to be superimposed on existing images and videos to create super-realistic, though fake, content. From mock footage of politicians saying and doing outrageous things, to reels of counterfeit broadcasts, its potential uses - and misuses - are deeply alarming: 'fake news' on steroids.
Even so, potentially the most devastating impact of this technology lies in what is being called 'deepfake porn'.
Once the preserve of technically skilled hobbyists, deepfake technology now makes it easier than ever to take somebody’s face and place it on an entirely different body to create explicit images or video - 'sex tapes' featuring celebrities' faces superimposed on porn performers' bodies already circulate online.
Similarly, the technology means fake - but no less damaging - revenge porn can now be created at the click of a button, without the victim having ever taken a naked photo. Sophie Mortimer, manager of the Revenge Porn Helpline, says: 'The harm caused by fake images can be just as damaging when no one can tell the difference: if it looks like you, then to everyone else, it is you.'
Laura Bates, founder of the Everyday Sexism campaign and a leading voice against revenge porn in the UK, told Grazia: 'It is just another form of extreme misogyny, abuse, coercion and a demonstration of power imbalance. It's not taken seriously because it is online and people simply don't join the dots to see this is part of a wider pattern of domestic violence and relationship abuse.'
Governments around the world, meanwhile, have yet to keep pace with the technology. The State of Virginia in the United States has banned deepfake revenge porn, amending legislation on the malicious sharing of explicit photos or videos without the victim's consent to cover faked images.
Here, revenge porn is a crime punishable by up to two years in prison - but as the legislation was introduced in 2015 under communications law, victims are not afforded the same anonymity as survivors of other sexual offences. And while offenders found guilty will be asked to delete offensive material, the nature of the internet means it can be difficult to eradicate it entirely.
The Law Commission, set up to oversee changes to the law, is currently reviewing existing revenge porn legislation for England and Wales - that will, a spokesperson confirmed to Grazia, include looking into legislating against 'deepfake pornography', which is currently excluded from revenge porn law. However, it is not due to report its findings to Parliament until 2021.
According to Mortimer, the damage is already being done. 'We are definitely seeing a rise in contacts from people fearful that images have been created and circulated,' she says.
A small number of similar cases have been prosecuted under harassment laws in the UK. Last year, Davide Buccheri, 25, was jailed for 16 weeks and ordered to pay £5,000 in compensation after he mocked up indecent images of his victim using Photoshop and uploaded them to a porn site.
'Changing societal attitudes towards women is just as important as changing the law,' says Bates. 'Many young people don't even know that [revenge porn] is illegal, and it is a widespread problem in young relationships. There is no use having a law if victims don't feel they can access it and don't believe they will be taken seriously.'