Good News, Facebook Rolls Out New Suicide Prevention Software

The social media site is going to look out a little bit harder for vulnerable users…


by Sophie Wilkinson

Facebook isn’t just a place to offload links to videos, opinions on pop stars, and memes of politicians. For so many, it’s a place where they can talk about their emotions. And sadly, some of the emotions and feelings experienced are really terribly sad – some people will even use Facebook to share that they’re feeling suicidal.

As grim as this is, on the plus side, Facebook is stepping up the way it deals with these sorts of posts.

Rob Boyle, a product manager, and Nicole Staubli, a safety specialist, wrote a Facebook post to explain that, if a person flags up that a friend is posting messages that look suicidal, that friend will then have their social media reviewed by a specially trained team from Facebook.

And then, if necessary, the team will send the user notifications with links to suicide prevention resources.

And the person who first reported their suicidal-seeming friend? They’ll get a message telling them to call or message their friend, or to seek the help of a trained professional.

Sounds pretty great, doesn’t it? We all hear of Facebook’s algorithms getting things a tad wrong, but when it’s actual trained humans behind the scheme, it’s got to be a bit more impressive, right?

This new system is an improvement on the previous one, which was introduced in 2011. Under that one, you’d have to report mates who were sending out suicidal-seeming messages by uploading links and screenshots to an official Facebook suicide prevention page, reports Time.

Plus, this new system was made in coordination with loads of American suicide prevention charities, although Facebook made clear that the service wouldn’t be a replacement for local emergency services.

UPDATE: As of February 19th 2016, Facebook have announced that they are now rolling out a new suicide prevention tool in the UK. This is an updated version of the system we told you about last year, which was trialled in Australia and the US. They have worked in collaboration with Samaritans, which will allow users to report posts that they are worried about in a much more direct way.

Any posts flagged up as troubling will be dealt with by a team working around the clock to review them. When a post is reported to Facebook as being of concern, a private message is also sent directly to the user who posted it, offering them different support options - they will be encouraged to connect with Samaritans, or with a friend if they prefer.

Facebook stress that anyone concerned by a post suggesting someone is having suicidal thoughts should go straight to emergency services first of all, but they hope that this extended support will help users and remind people that they are not alone.

You might also be interested in:

British Teens Die In Suicide Pact Because They Thought Their Families Wouldn’t Accept Them

Pro-Self Harm Hashtags Are On The Rise, But Regulating Them Isn’t Always The Answer

I Became An Accidental Thinspiration Sharer

Follow Sophie on Twitter @sophwilkinson

Picture: Eylul Aslan

This article originally appeared on The Debrief.
