Social media is often the first place people turn to vent frustration and share their thoughts when they are upset. Now, if someone posts something suggesting they may be having suicidal or self-harming thoughts, there is a new way to help them.
Facebook recently partnered with the University of Washington to roll out a new tool that works as a suicide alert system.
The feature lets users report a Facebook post for content review if they identify it as distressing or suicidal. The site's support team will assess it and contact suicide prevention organizations if they believe the user's life is genuinely at risk.
"Keeping you safe is our most important responsibility on Facebook," said Rob Boyle of Facebook Inc. "Today, at our fifth Compassion Research Day, we announced updated tools that provide more resources, advice and support to people who may be struggling with suicidal thoughts and their concerned friends and family members."
The company further said that it had worked with the mental health organizations Forefront, Now Matters Now, the National Suicide Prevention Lifeline, and Save.org, and had consulted people who have experienced self-injury or suicidal thoughts.
“One of the first things these organizations discussed with us was how much connecting with people who care can help those in distress,” Boyle added. “If someone on Facebook sees a direct threat of suicide, we ask that they contact their local emergency services immediately. We also ask them to report any troubling content to us.”
For now, the feature is only available in the United States, but if it proves useful, the social media giant plans to introduce it worldwide.