Facebook has released a new feature so naturally I’m going to discuss it. I’ve already discussed the somewhat morbid feature that authorizes a user to control your Facebook profile after you die. I also discussed the partnership Facebook created with the National Center for Missing & Exploited Children (NCMEC) so that the platform can post Amber Alerts to users close to the affected area.
Facebook launched a feature this past week aimed at suicide prevention, in partnership with the National Suicide Prevention Lifeline. If one of your Facebook friends posts something that seems suicidal, you can report it to Facebook. The post will then be reviewed by a third party, who will decide whether or not to reach out to the user.
If the user needs to be contacted, they will be met with this popup.
Only the user will see this popup, and no one else will know that his or her post was flagged. That privacy matters, especially during a potentially difficult time.
The user will see this popup next.
The user will have the option of seeking help from helplines or from a friend. The user can also skip this altogether, in case they are not feeling suicidal.
I think this is a great move by Facebook, because it's tough watching people post worrisome things on Facebook without feeling close enough to them to reach out. I think the fact that the reporting feature is anonymous will encourage people to use it: the friend won't see that you flagged his or her post, so they can't be upset with you.
About the launch of the new feature, Facebook Safety’s Facebook page had a post that said, “Besides encouraging them to connect with a mental health expert at the National Suicide Prevention Lifeline, we now also give them the option of reaching out to a friend, and provide tips and advice on how they can work through these feelings. All of these resources were created in conjunction with our clinical and academic partners.”
Since the feature was created with experts in the medical field, I think it will offer the resources that those struggling will need. Whether or not they will be willing to accept the help is another issue in itself.
That being said, I think the success of the feature is dependent on whether or not users flag potentially suicidal posts and whether or not the users that really need help accept it.
If you see someone posting anything suicidal on Facebook, be sure to use this feature and help out a friend.