Published 12:26 IST, September 18th 2019
Facebook redefines 'terrorist organisations' to prevent extremism
Facebook said its automated systems remove content glorifying the Islamic State group and al-Qaida before it is reported. Facebook discussed the steps being taken to remove extremist content from its platform.
Facebook said its automated systems remove content glorifying the Islamic State group and al-Qaida before it is reported. In a recent blog post, Facebook discussed the steps being taken to remove extremist content and prevent extremist organisations from using the platform. In March, a deadly terrorist attack in Christchurch, New Zealand, that killed 50 people was live-streamed on Facebook.
What Facebook has done so far
In May, Facebook announced restrictions on who can use the Facebook Live feature.
"We also co-developed a nine-point industry plan in partnership with Microsoft, Twitter, Google and Amazon, which outlines the steps we’re taking to address the abuse of technology to spread terrorist content," Facebook said.
Ideology vs behaviour
Facebook said it identifies groups as terrorist organisations based on their behaviour on the platform, not their ideologies.
"We initially focused on global terrorist groups like ISIS and al-Qaeda."
Facebook said it removed over 26 million pieces of content related to ISIS and al-Qaeda in the last two years, and that 99 per cent of extremist content was removed before it was reported. Facebook also said it has banned more than 200 white supremacist organisations.
Christchurch attack aftermath
"The (Christchurch) attack demonstrated the misuse of technology to spread radical expressions of hate, and highlighted where we needed to improve detection and enforcement against violent extremist content."
Facebook attributed its failure to detect and stop the Christchurch attack live stream to a lack of training data for its automated machine learning systems.
"The video of the attack in Christchurch did not prompt our automatic detection systems because we did not have enough content depicting first-person footage of violent events to effectively train our machine learning technology."
To address this gap, Facebook is now obtaining camera footage from firearms training programs in the US and the UK to train its systems.
Facebook redefines terrorist organisations
Facebook said it has also updated its definition of terrorist organisations.
"While our previous definition focused on acts of violence intended to achieve a political or ideological aim, our new definition more clearly delineates that attempts at violence, particularly when directed toward civilians with the intent to coerce and intimidate, also qualify."