
On World Suicide Prevention Day, Facebook pledges to prevent suicide, self-harm


IN COMMEMORATION of World Suicide Prevention Day (WSPD), observed globally on Tuesday, September 10, Facebook deepened its commitment to keeping people safe on all of its platforms.


Facebook’s Global Head of Safety, Antigone Davis, noted that since Facebook’s inception, the platform has worked with experts from around the world to inform its policies, practices and products supporting those at risk of suicide or self-injury.

According to reports made available to National Light detailing the steps Facebook has taken in the past year, the social media firm’s senior executive revealed additional actions it plans to take. Davis said the intention is to keep people safe on its apps, especially those who are most vulnerable.

“Earlier this year, we began hosting regular consultations with experts from around the world to discuss some of the more difficult topics associated with suicide and self-injury. These include how we deal with suicide notes, the risks of sad content online and newsworthy depictions of suicide. Further details of these meetings are available on Facebook’s new Suicide Prevention page in our Safety Center,” she stated.

As a result of these consultations, Davis said, Facebook has made several changes to improve how it handles suicide and self-injury content. It has tightened its policy on self-harm and no longer allows graphic cutting images, to avoid unintentionally promoting or triggering self-harm, even when the person posting is seeking support or expressing themselves to aid their recovery.

“On Instagram, we’ve also made it harder to search for this type of content and kept it from being recommended in Explore. We’ve also taken steps to address the complex issue of eating disorder content on our apps by tightening our policy to prohibit additional content that may promote eating disorders.

“And with these stricter policies, we’ll continue to send resources to people who post content promoting eating disorders or self-harm, even if we take the content down. Lastly, we chose to display a sensitivity screen over healed self-harm cuts to help avoid unintentionally promoting self-harm,” Davis noted.

She said that Facebook’s engagement with experts has proven so valuable that it is also hiring a health and well-being expert to join its safety policy team.

“And for the first time, we’re also exploring ways to share public data from our platform on how people talk about suicide, beginning with providing academic researchers with access to the social media monitoring tool, CrowdTangle.

“To date, CrowdTangle has been available primarily to help newsrooms and media publishers understand what is happening on Facebook. But we are eager to make it available to select researchers who focus on suicide prevention to explore how information shared on Facebook and Instagram can be used to further advancements in suicide prevention and support,” she added.

On how it intends to make its platforms safer for difficult conversations, Davis said that Facebook plays a unique role in facilitating such connections and is taking additional steps to support people discussing these sensitive topics, especially young people.

“To help young people safely discuss topics like suicide, we’re enhancing our online resources by including Orygen’s #chatsafe guidelines in Facebook’s Safety Center and in resources on Instagram when someone searches for suicide or self-injury content.

“The #chatsafe guidelines were developed together with young people to provide support to those who might be responding to suicide-related content posted by others or for those who might want to share their own feelings and experiences with suicidal thoughts, feelings or behaviors,” she further stated.

She also assured that Facebook will continue to invest in people, technology and resources so that it can do more to protect people on its apps.
