A Google Algorithm Seems To Think Brands Like Boohoo And Missguided Are Pretty ‘Racy’

An investigation into the modesty of fast-fashion brands' clothing has revealed intriguing information about the way Google rates images.

Group of women modelling swimwear

by Georgia Aspinall

An investigation into the ‘raciest’ clothing from different fashion brands has highlighted the fact that Google uses software to rate imagery as part of a ‘safe search’ tool and scores clothing based on how ‘skimpy or sheer’ it is.

Google’s safe search tool detects adult, spoof, medical, violent and ‘racy’ images in order to protect children from seeing explicit content, and adults from stumbling across it at work. While the tool itself serves a perfectly sensible purpose, the way in which the software decides what counts as a ‘racy’ image is seemingly quite problematic.
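The article doesn’t name the exact Google product behind the scoring, but Google does offer this kind of image rating through its Cloud Vision API, which returns a likelihood for each of those five categories. Here’s a minimal sketch of what querying it looks like, assuming the Cloud Vision Python client and a placeholder image file called model_photo.jpg:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Load a product/model photo (placeholder filename).
with open("model_photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# SafeSearch returns a likelihood for each of the five categories
# the article mentions: adult, spoof, medical, violence and racy.
annotation = client.safe_search_detection(image=image).safe_search_annotation

likelihood_names = (
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY"
)
for category in ("adult", "spoof", "medical", "violence", "racy"):
    print(category, likelihood_names[getattr(annotation, category)])
```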

The BBC launched an investigation into how modest the clothing sold by high-street fashion brands is compared with fast fashion, finding that fast-fashion brands like Missguided, Pretty Little Thing, I Saw It First, Boohoo and Nasty Gal have twice the proportion of ‘racy’ images on their websites that high-street brands do.

According to Google’s software, 8% of women’s modelling images on high-street websites were ‘racy’, compared with 16% on fast-fashion sites. Of the companies analysed (including high-street brands like Topshop, River Island, New Look and Urban Outfitters), seven had men’s sites where only 2% of the images were classed as ‘racy’.
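The BBC hasn’t published its exact methodology, but percentages like those could be reproduced by running every modelling image on a site through the same rating and counting how many cross a chosen threshold. A rough, hypothetical sketch of that tallying (the LIKELY cut-off and the helper name are assumptions, not the BBC’s actual method):

```python
from google.cloud import vision

def share_of_racy_images(image_urls: list[str]) -> float:
    """Fraction of images whose 'racy' likelihood is LIKELY or higher.

    Hypothetical reconstruction: the threshold is an assumption, not
    the methodology the BBC actually used.
    """
    client = vision.ImageAnnotatorClient()
    racy_count = 0
    for url in image_urls:
        image = vision.Image()
        image.source.image_uri = url  # Vision can fetch publicly hosted images
        annotation = client.safe_search_detection(image=image).safe_search_annotation
        if annotation.racy >= vision.Likelihood.LIKELY:
            racy_count += 1
    return racy_count / len(image_urls) if image_urls else 0.0

# e.g. a high-street catalogue might come back around 0.08 (8%)
# and a fast-fashion one around 0.16 (16%), per the figures above.
```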

'Our website reflects what appeals to the young women who love to buy from us - sassy, empowered, unafraid of what others think,' a representative for Missguided told the BBC. 'We run our website for them, not an artificial intelligence algorithm.'

The software doesn’t just identify raciness by clothing, but also by the way in which the models pose. So, if a model is deemed to be posing ‘seductively’, the image will receive a higher raciness score. Naturally, swimwear or underwear on a site also affects the overall score.

So Google's software apparently equates nudity with sex. Despite the fact that women wear less clothing in plenty of non-sexual contexts (you know: bathing, swimming, at the doctor's), the algorithm seems to read nudity only in a sexual sense.

Ultimately, beyond this investigation being fairly reductive (did we really need data on whether fast-fashion brands sell more sheer clothing?!), it also shows that Google needs to update its software and get with the programme. Pun intended.

Read More:

When Will Men Learn That Women Don’t Dress For Their Approval, Married Or Not?

Could This Fast Fashion Tax Save The World?

Can You Call Yourself A Feminist If You Buy Fast Fashion?
