If you, like some of us at The Debrief, spent a hungover day this weekend on your Facebook timeline searching for solace, you’ll know that it can make for a pretty depressing browse-sesh, seeing mates go for roasts, sunning themselves on holiday or generally doing anything that isn't spending the day trying to rid themselves of self-inflicted headache and nausea. But did you know that Facebook – as in the company – sometimes wants you to feel depressed?
It’s emerged that the company has been tweaking its News Feed algorithm to manipulate users’ feelings. It’s such an infringement of users’ privacy that Jim Sheridan, an MP who sits on the Commons’ media select committee, along with various privacy groups, has called for the company to be investigated.
‘This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people,’ he said overnight.
The experiment, conducted with researchers from Cornell University and the University of California, tinkered with the home pages of 689,000 users to make them feel more positive or more negative, through something the researchers call ‘emotional contagion’.
In one test, people were exposed to more ‘positive emotional content’ from their friends – all of those #blessedsunday posts and ‘wooo I got the job’ statuses were included here, we’re guessing – and in another, people were exposed to more ‘negative emotional content’, such as photos of pigeons eating KFC off the pavement, or ‘everything’s going wrong at the moment x’ cries for help.
The study concluded that we pick up our emotions from other people: ‘Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.’
Well, duh. This is the most obvious finding from a scientific study since the one that asked whether men watch porn (100 per cent of them said ‘yes’).
Why did Facebook run the experiment, then? Well, according to comments a spokeswoman made to the US journal Proceedings of the National Academy of Sciences, it was ‘to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow.’
Here’s a clue: don’t mess with your users’ heads. As Susan Fiske, the Princeton academic who edited the study, has pointed out in The Guardian, people should be told if they’re going to be used as guinea pigs: ‘People are supposed to be told they are going to be participants in research and then agree to it and have the option not to agree to it without penalty.’
Between this and everybody’s mums muscling in on the fun of Facebook, maybe it’s worth considering a little purge of the site… until our next hangover, at least.
Follow Sophie on Twitter @sophwilkinson
This article originally appeared on The Debrief.