Facebook Upgrading AI To Help Save Lives of Suicidal People

Facebook's artificial intelligence (AI) will get smarter at detecting posts that indicate suicidal thoughts.

In a blog post on Monday, CEO Mark Zuckerberg announced that they are upgrading the AI tools “to identify when someone is expressing thoughts about suicide on Facebook so we can help get them the support they need quickly.”

Facebook Artificial Intelligence to help prevent suicides

Details about the program are thin as of now, but Facebook did mention that the software will search for certain phrases, such as the questions “Are you OK?” and “Can I help?” Once the AI recognizes the pattern, the social networking site will send mental health resources to the user in need or contact local first responders. Facebook’s new artificial intelligence technology is said to be 80 to 90% accurate in predicting whether a person is at risk of suicide.
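Facebook has not published details of the model, but a minimal sketch of the kind of phrase-based flagging the company describes might look like the following. Everything here, from the function name to the two-comment threshold, is a hypothetical illustration rather than Facebook’s actual implementation.

```python
# Hypothetical sketch of phrase-based risk flagging, loosely based on
# the behavior described above. Facebook's real model is not public;
# every name and threshold here is illustrative only.

CONCERN_PHRASES = ["are you ok", "can i help"]  # phrases the article cites

def flag_post(comments: list[str]) -> bool:
    """Return True if enough comments contain concern phrases."""
    hits = sum(
        1
        for comment in comments
        for phrase in CONCERN_PHRASES
        if phrase in comment.lower()
    )
    # Assumed rule: a single concerned comment may be noise; two or
    # more suggest friends are worried, so surface the post for review.
    return hits >= 2

# Example usage
print(flag_post(["Are you OK?", "Can I help?", "Hang in there"]))  # True
```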

Guy Rosen, Facebook’s vice president of product management, said, “This is about shaving off minutes at every single step of the process, especially in Facebook Live.”

Over the past month or so, Facebook has done over 100 “wellness checks” with first responders visiting affected users, Rosen noted, adding that in some instances, the first responders arrived as the person was still broadcasting, according to TechCrunch.

Rosen further noted that because friends and family are connected on Facebook, the platform can reach out to a person in distress and connect them with people who can support them.

“It’s part of our ongoing effort to help build a safe community on and off Facebook,” Rosen said.

In addition to detecting suicidal posts through pattern recognition, the Menlo Park, California-based company is also deploying AI to rank reports by urgency, helping it decide which cases require attention first.
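A toy sketch of what urgency-based triage could look like follows; the `Report` structure, the scoring inputs, and the `triage` function are assumptions for illustration, not Facebook’s actual system.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical sketch of urgency-based triage of flagged reports, as
# described above. The data shapes and scores are stand-ins; Facebook's
# actual ranking model is not public.

@dataclass(order=True)
class Report:
    neg_urgency: float              # negated so the heap pops highest first
    post_id: str = field(compare=False)

def triage(reports: list[tuple[str, float]]) -> list[str]:
    """Order reports so the highest urgency score is reviewed first."""
    heap = [Report(-score, post_id) for post_id, score in reports]
    heapq.heapify(heap)
    return [heapq.heappop(heap).post_id for _ in range(len(heap))]

# Example: reports tagged with model-assigned urgency scores
print(triage([("post-1", 0.35), ("post-2", 0.91), ("post-3", 0.60)]))
# -> ['post-2', 'post-3', 'post-1']
```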

Doing responsible work

Facebook’s AI-driven suicide prevention software builds on the suicide and self-harm reporting tools the platform unveiled in 2016. Those tools let users flag posts that hint at suicide or self-harm; flagged posts are reviewed by a special Facebook team whose members are trained to recognize suicidal thoughts and self-harm behaviors.

Nancy Salamy, the executive director of Crisis Support Services of Alameda County, referred to Facebook as a “responsible partner,” adding that “they work with crisis centers, and their intention is to try to help somebody on the platform. They want to get [suicidal users] some help.”

Until now, the tools were available only in the U.S. For the first time, Facebook is going beyond its home country and making the tools, including the new AI detection, available globally.

Suicide is a severe and growing global problem: according to the World Health Organization (WHO), someone dies by suicide every 40 seconds, and suicide is the second leading cause of death among people aged 15 to 29. Facebook thus has quite a task on its hands, especially considering that at least one in five of the billions of videos broadcast on Facebook is live, notes FastCompany.