Published 12:59 IST, September 18th 2019

Facebook auto-generating pages for Islamic State, al-Qaida

A whistleblower’s complaint shows that Facebook itself has inadvertently provided the two extremist groups with a networking and recruitment tool by producing dozens of pages in their names


In the face of criticism that Facebook is not doing enough to combat extremist messaging, the company likes to say that its automated systems remove the vast majority of prohibited content glorifying the Islamic State group and al-Qaida before it’s reported. But a whistleblower’s complaint shows that Facebook itself has inadvertently provided the two extremist groups with a networking and recruitment tool by producing dozens of pages in their names. The social networking company appears to have made little progress on the issue in the four months since The Associated Press detailed how pages that Facebook auto-generates for businesses are aiding Middle East extremists and white supremacists in the United States. On Wednesday, U.S. senators on the Committee on Commerce, Science, and Transportation will be questioning representatives from social media companies, including Monika Bickert, who heads Facebook’s efforts to stem extremist messaging.

Almost 200 auto-generated pages identified

The new details come from an update of a complaint to the Securities and Exchange Commission that the National Whistleblower Center plans to file this week. The filing obtained by the AP identifies almost 200 auto-generated pages — some for businesses, others for schools or other categories — that directly reference the Islamic State group, and dozens more representing al-Qaida and other known groups. One page listed as a “political ideology” is titled “I love Islamic state.” It features an IS logo inside the outlines of Facebook’s famous thumbs-up icon. In response to a request for comment, a Facebook spokesperson told the AP: “Our priority is detecting and removing content posted by people that violates our policy against dangerous individuals and organisations to stay ahead of bad actors. Auto-generated pages are not like normal Facebook pages as people can’t comment or post on them, and we remove any that violate our policies. While we can’t catch every one, we remain vigilant in this effort.”


Facebook has a number of functions that auto-generate pages from content posted by users. The updated complaint scrutinises one function that is meant to help business networking. It scrapes employment information from users’ pages to create pages for businesses. In this case, it may be helping extremist groups because it allows users to like the pages, potentially providing a list of sympathisers for recruiters. The new filing also found that users’ pages promoting extremist groups remain easy to find with simple searches using their names. The researchers uncovered one page for “Mohammed Atta” with an iconic photo of one of the al-Qaida adherents, who was a hijacker in the Sept. 11 attacks. The page lists the user’s work as “Al Qaidah” and education as “University Master Bin Laden” and “School Terrorist Afghanistan.” Facebook has been working to limit the spread of extremist material on its service, so far with mixed success.


Facebook uses AI to fight extremism

In March, it expanded its definition of prohibited content to include U.S. white nationalist and white separatist material as well as that from international extremist groups. It says it has banned 200 white supremacist organisations and removed 26 million pieces of content related to global extremist groups like IS and al-Qaida. It also expanded its definition of terrorism to include not just acts of violence intended to achieve a political or ideological aim, but also attempts at violence, especially when aimed at civilians with the intent to coerce and intimidate. It’s unclear, though, how well enforcement works if the company is still having trouble ridding its platform of well-known extremist organisations’ supporters. But as the report shows, plenty of material gets through the cracks — and gets auto-generated. The AP story in May highlighted the auto-generation problem, but the new content identified in the report suggests that Facebook has not solved it.


The report also says that researchers found that many of the pages referenced in the AP report were removed more than six weeks later, on June 25, the day before Bickert was questioned at another congressional hearing. The issue was flagged in the initial SEC complaint filed by the centre’s executive director, John Kostyack, which alleges the social media company has exaggerated its success in combatting extremist messaging.


“Facebook would like us to believe that its magical algorithms are somehow scrubbing its website of extremist content,” Kostyack said. “Yet those very same algorithms are auto-generating ps with titles like ‘I Love Islamic State,’ which are ideal for terrorists to use for networking and recruiting.”
