How The Terrifying DeepNudes App Violated Every Rule About Consent Ever Made

It's since been taken down by the creators because 'the probability that people will misuse it is too high' - which raises the question: what else did they think people were going to use it for?


by Sofia Tindall

Imagine a world in which you could pay $50 to see any woman you know naked - oh, and without her permission. Sounds pretty invasive, doesn't it? Chances are that if you're not posting naked pictures of yourself to the internet of your own free will, it's because you don't want people seeing them. Yet creating realistic fake nudes of women, for less than it costs to order a Deliveroo and an Uber after a night out, was exactly the service the application DeepNudes was offering.

DeepNudes is as sinister a development in the world of technology as it sounds: using artificial intelligence, it enabled users to transform an uploaded picture of a clothed woman into a passably realistic nude, complete with naked breasts adjusted to the subject's dimensions and a vulva. The end product (which could only be generated for women) was a fake nude of whoever the user had uploaded an image of. Are you thinking about how many pictures of you are floating around on Instagram and Facebook right now? Yeah, me too. Even with your privacy settings cranked up, terrifyingly, chances are it wouldn't be hard for someone to access a full-length picture of most of us - not least if they were on your friends or followers list.

Since Vice reported on the app, the creator - who has chosen to remain anonymous but goes by the name Alberto - has taken it down. In a tweet on the deepnudeapp Twitter account, he wrote that 'the probability that people will misuse it is too high' (which rather raises the question: why would you create technology of this nature in the first place? Couldn't the creation of any fake nude of a woman be classed as 'misuse', regardless of how many people download the application?).

When Motherboard (the tech arm of Vice) attempted to use DeepNudes on a picture of a man, the software superimposed a picture of a vagina onto his groin. So just to be completely clear: this was software specifically designed to be used on women and women alone - for what purpose, we'd rather not think about.

The glaring red flag is this: why would anyone want to be in possession of a fake nude of a woman - of anyone - if not to violate her privacy in some way? Whether the resulting photos were kept for private use or deployed for blackmail or revenge porn, the app overstepped the line of what constitutes sexual consent in mind-boggling ways. In the same way that upskirting or spreading revenge porn is a sex crime because the victim has not consented to their image being taken or used in that way, the existence of software that enables almost anyone to own a fake nude of you is a violation of sexual privacy.

When Motherboard tested the app, it found that the realism of the fake nudes varied depending on what kind of picture was uploaded: high-resolution pictures of swimwear models in bikinis produced noticeably more realistic results than images of fully clothed women, and results also varied depending on skin tone and whether or not the subject of the picture was directly facing the camera. Speaking about how the algorithm works, the creator Alberto told Vice: 'The networks are multiple, because each one has a different task: locate the clothes. Mask the clothes. Speculate anatomical positions. Render it.' In the deepnudeapp tweet, he went on to state, 'to be honest the app's not that great, it only works with particular photos'.

So essentially, to produce a nude that looked realistic, the user would need to access a picture of you in a bikini, looking at the camera. Nonetheless, it only took me a quick scroll through my Instagram to find a picture of myself in a bikini that would probably fit DeepNudes' criteria for producing a reasonably realistic fake nude. I don't feel as though I should be ashamed of having that picture on my social media account - it's a boundary that I'm happy with - but the thought that until recently a technology existed that could have overstepped that line is terrifying. Furthermore, like Pandora's box, those algorithms are out there now - and they essentially victimize and endanger women simply for having photos of themselves online at all.

Fake or real - there's only one reason that someone should be in possession of a nude picture of you, and that's if you've intended for them to have one.
