A month after a teen took her own life on Facebook Live – at least the third such incident in the past few months – the social media giant is introducing new tools and adding artificial intelligence detection in the hope of spotting calls for help in time to do something about them.
The option to “report” suicidal or self-harming content in someone’s status or video is nothing new on the site, but beginning this month, the live-streaming service will put support options front and center.
On Facebook Live, someone whose broadcast has been reported will see their screen partly covered by a message from the company that reads: “Someone thinks you might need additional support right now and asked us to help.”
Experts from suicide prevention organizations helped Facebook refine these help tools.
Jennifer Guadagno, Facebook’s lead researcher on suicide prevention, told CTN News on Tuesday that the organizations warned the company to be very careful about cutting off a live video too soon, even if someone was threatening suicide. Real-time streaming gives friends and family a chance to notice that something is wrong, reach out, and provide support.
According to CTN News reports, this has saved a handful of people in exactly that situation: viewers are reported to have called police and stopped suicide attempts in Minnesota, Arizona, Bangkok, Hong Kong and Ohio.
But for others, such as Naika Venant, a 14-year-old girl in state custody, help came too late. For situations like that, Guadagno said, Facebook has developed partnerships with organizations such as Crisis Text Line so that people in crisis can immediately chat with a mental health professional directly in the Facebook messaging application, at any moment.
These tools are extensions of resources previously available on the site, part of an effort to “reduce friction” between people experiencing suicidal impulses and the services designed to help them.
“We are trying to reach people where they are,” Guadagno said.
Facebook’s new approach draws on all the statuses and videos previously reported as suicidal (and on the comments posted on them). The site will use pattern-recognition software to alert community reviewers to content that may be related to suicide or self-harm.
“We’ll also use some artificial intelligence,” said Vanessa Callison-Burch, product manager for Facebook’s new tools. “We are hopeful that the combination of technology and connections with friends and family can help people.”