
Facebook is Using AI to Uproot Terrorist Content

Facebook has revealed its approach to removing terrorist-related content, as the social network faces mounting criticism over its handling of extremist material on its platform.

In a detailed post published on its blog, the company explained how it tackles terror-related material, using a blend of human moderation and artificial intelligence.

“We want Facebook to be a hostile place for terrorists,” the post reads. “The challenge for online communities is the same as it is for real world communities – to get better at spotting the early signals before it’s too late.”

Artificial intelligence seems to be a key part of Facebook’s strategy. AI is mainly used to automatically recognise violent or extremist images and videos, to spot alarming language on users’ profiles and pages, and to prevent users banned for extremist activity from rejoining Facebook with new fake accounts.

“We are currently focusing our most cutting edge techniques to combat terrorist content about ISIS, Al Qaeda and their affiliates, and we expect to expand to other terrorist organizations in due course,” Facebook wrote.

Besides using AI, the Menlo Park-based company says that it has deployed over 150 people to deal with suspicious activities online, that it is operating at a cross-platform level (i.e.: the same counter-terrorism measures apply on Facebook subsidiaries WhatsApp and Instagram), and that it is working with multiple partners in the government and in tech to organise a coordinated response.

The blog post seems to be Facebook’s implicit answer to recent accusations from British PM Theresa May and French President Emmanuel Macron, both of whom went so far as to threaten to fine technology companies that fail to take adequate measures to nip terrorist activity in the bud.
