Published 12:26 IST, September 18th 2019

Facebook redefines 'terrorist organisations' to prevent extremism

Facebook said its automated systems remove content glorifying the Islamic State group and al-Qaida before it’s reported. In a recent blog post, Facebook discussed the steps being taken against extremist content.

Reported by: Tanmay Patange

Facebook said its automated systems remove content glorifying the Islamic State group and al-Qaida before it’s reported. In its recent blog post, Facebook discussed the steps being taken to remove extremist content and prevent extremist organisations from using the platform. In March, a deadly terrorist attack in Christchurch, New Zealand, that killed 50 people was live-streamed on Facebook.

What Facebook has done so far

In May, Facebook announced restrictions on who can use the Facebook Live feature.


"We also co-developed a nine-point industry plan in partnership with Microsoft, Twitter, Google and Amazon, which outlines steps we’re taking to address abuse of technology to spread terrorist content," Facebook said.


Ideology vs behaviour

Facebook said it identified groups as terrorist organisations based on their behaviour on the platform and not their ideologies.


"We initially focused on global terrorist groups like ISIS and al-Qaeda."

Facebook said it removed over 26 million pieces of content related to ISIS and al-Qaeda in the last two years, and that 99 per cent of extremist content was removed before it was reported. Facebook also said it banned over 200 groups and organisations related to white supremacy.



Christchurch attack aftermath

"The (Christchurch) attack demonstrated the misuse of technology to spread radical expressions of hate, and highlighted where we needed to improve detection and enforcement against violent extremist content."

Facebook further blamed its inability to detect and stop the Christchurch attack live stream on a lack of training data for its automated machine learning systems.



"The video of the attack in Christchurch did not prompt our automatic detection systems because we did not have enough content depicting first-person footage of violent events to effectively train our machine learning technology."

With that said, Facebook is now obtaining camera footage from firearms training programs in the US and UK to train its systems.



Facebook redefines terrorist organisations

Facebook said it also updated its definition of terrorist organisations.

"While our previous definition focused on acts of violence intended to achieve a political or ideological aim, our new definition more clearly delineates that attempts at violence, particularly when directed toward civilians with intent to coerce and intimidate, also qualify."
