Published 18:48 IST, September 11th 2019

Facebook tightens policies to prevent self-harm, suicide images

Facebook is preventing self-harm and suicide content from spreading on the platform. The company said it had made several changes to the way content related to self-harm and suicide is handled.

Reported by: Tech Desk

Facebook is preventing self-harm and suicide content from spreading on the platform. Facebook said it had made several changes to the way content related to self-harm and suicide has been handled by the company so far. Facebook also said that it has been working with experts around the world to address these issues and to understand how they affect users interacting with suicide-related content on Facebook. Apart from material related to suicide and self-harm, Facebook is also preventing content related to eating disorders. Meanwhile, Facebook continues to show a sensitivity screen over inappropriate content to help avoid promoting self-harm.



Facebook statement

"We tightened our policy around self-harm to longer allow graphic cutting ims to avoid unintentionally promoting or triggering self-harm, even when someone is seeking support or expressing mselves to aid ir recovery. On Instagram, we’ve also made it harder to search for this type of content and kept it from being recommended in Explore," said Antigone Davis, Global Head of Safety of Facebook.


What does it mean?

As a result of the changes made to its policies on suicide content, Facebook will no longer allow graphic images of self-harm on its platform. The announcement comes amid criticism of how social media companies moderate violent and potentially dangerous content online. Facebook is also making it harder to search for self-harm-related content on Instagram and will ensure that it does not appear as recommended in the Explore section. Facebook's statement comes on World Suicide Prevention Day.



Facebook has a team of moderators to monitor content such as live-streamed violent acts as well as suicides. Facebook has been working with at least five third-party companies in at least eight countries on content review, according to Reuters. Governments around the world have been working to improve control over dangerous content on social media websites, as well as the spread of online pornography and election propaganda.



18:27 IST, September 11th 2019