Facebook has been coming under fire for the past few months, and it has something to do with fake news. Along with Brexit and Trump came news articles trending on the social media site that may or may not have been true. Facebook founder Mark Zuckerberg faced allegations that the platform helped sway the U.S. election, which he denied, saying he thought the problem was being overstated. But now the social media giant has announced that it's taking control. As of today, Facebook will roll out a new policy to tackle fake news. Here’s what we know so far.
Adam Mosseri, Vice President of Product Management for Facebook’s News Feed, wrote on the company’s blog: 'We believe in giving people a voice, and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully.'
He goes on to explain that Facebook will be 'focused on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organisations.' This will be split into four areas, which are as follows.
1. Easier Reporting
Writing a story and uploading it to Facebook is super easy – we do it countless times a day, and so do the people generating fake news for easy clicks. Facebook will be testing ways to make it easier to report those hoaxes, so if you come across something you think might be fake, you’ll be able to click the upper right-hand corner of the post to flag it.
2. Informed Sharing
Informed Sharing will look at what the Facebook community is telling them – so it will keep a close eye on how we consume our news. Think about this: you might read a story but not share it. Facebook reads that as a sign the story may have misled you in some way, so it will be more likely to rank it as fake news. I’m just going to call bullshit on this one – generally speaking, just because you don’t share something, it doesn’t mean you think it’s fake. You might read a really opinionated article on Brexit, for example, which really resonates with you, but there’s no way you’d share it on your feed because you think you’d be slammed by your Facebook friends. So sorry FaceyB, we’re not keen on this one.
3. Flagging Stories as Disputed
The logic behind this one is that by providing more context, Facebook can let people decide for themselves what to trust and what to share. It will start a program working with third-party fact-checking organisations. Using those reports from the Facebook community, they will be able to identify whether a story is fake. It will then be flagged as disputed, with a link to an article explaining why. Stories flagged as fake will also appear lower down in your News Feed. You can still share them, but you’ll see a warning next to the story.
4. Disrupting Financial Incentives for Spammers
Fake news is there to make money – fake news sites pass themselves off as news organisations and post hoaxes to get people to click through. Hello, clickbait. Most of these sites are covered in ads, so every click-through earns them serious cash. Facebook wants to stop this. It will be eliminating the ability to spoof domains, which should prevent sites passing themselves off as real publications, and it will also be analysing publisher sites to detect where policy actions might be necessary.
I mean, most of the above sounds pretty great, and it looks like they want to tackle this issue head on – if it all goes to plan, anyway. It won't really affect your feed massively, except that hopefully there will be less fake or misleading news floating around. But apart from that, and the option to report things you think are fake, your news feed will be mostly the same. Here’s to 2017 and less fake news. PLEASE.