Survivors of Auschwitz behind a barbed wire fence in February 1945
Facebook says it ‘does not welcome local law that stands as an obstacle to an open and connected world’. Composite: Getty Images

How Facebook flouts Holocaust denial laws except where it fears being sued


Leaked guidelines for moderators also reveal refugees are not protected by hate speech rules because they are a ‘hot topic’

Facebook’s policies on Holocaust denial will come under fresh scrutiny following the leak of documents that show moderators are being told not to remove this content in most of the countries where it is illegal.

The files explain that Holocaust denial material, if reported, should be taken down in only four of the 14 countries where it is outlawed.

One document says the company “does not welcome local law that stands as an obstacle to an open and connected world” and will only consider blocking or hiding Holocaust denial messages and photographs if “we face the risk of getting blocked in a country or a legal risk”.

A picture of a concentration camp with the caption “Never again Believe the Lies” was permissible if posted anywhere other than the four countries in which Facebook fears legal action, one document explains. Facebook contested the figures but declined to elaborate.

The social media service has also decided that migrants, refugees and asylum seekers should be regarded as a “quasi-protected category” – so they will not receive the protections given to other vulnerable groups.

Documents show Facebook has told moderators to remove dehumanising speech or any “calls for violence” against refugees. Content “that says migrants should face a firing squad or compares them to animals, criminals or filth” also violates its guidelines.

But it adds: “As a quasi-protected category, they will not have the full protections of our hate speech policy because we want to allow people to have broad discussions on migrants and immigration which is a hot topic in upcoming elections.”

According to the documents, comments permitted under the policy include ones such as: “Islam is a religion of hate. Close the borders to immigrating Muslims until we figure out what the hell is going on”; “migrants are so filthy”; “migrants are thieves and robbers”; and “Mexican immigrants are freeloaders mooching off of tax dollars we don’t even have”.

The documents show moderators have been told they do not have to delete comments such as “Fuck immigrant” and “Keep the horny migrant teenagers away from our daughters”.

However, it is a violation of the rules on migrants to “equate them to other types of criminals, eg rapists, child molesters, murderers or terrorists”.

The definitions are set out in training manuals provided by Facebook to the teams of moderators who review material that has been flagged by users of the social media service.

The documents explain the rules and guidelines the company applies to hate speech and “locally illegal content”, with particular reference to Holocaust denial.

One 16-page training manual explains Facebook will only hide or remove Holocaust denial content in four countries – France, Germany, Israel and Austria. The document says this is not on grounds of taste, but because the company fears it might get sued.

“We believe our geo-blocking policy balances our belief in free expression with the practical need to respect local laws in certain sovereign nations in order to remain unblocked and avoid legal liability. We will only use geo-blocking when a country has taken sufficient steps to demonstrate that the local legislation permits censorship in that specific case,” it says.

“Some 14 countries have legislation on their books prohibiting the expression of claims that the volume of death and severity of the Holocaust is overestimated. Less than half the countries with these laws actually pursue it. We block on report only in those countries that actively pursue the issue with us.”

Facebook said the number of countries set out in its documents “is not accurate” – but repeatedly declined to say anything more.

Monika Bickert, head of global policy management at Facebook, said: “Not every team of employees is involved in enforcing our policies around locally illegal content. Whether reported by government entities or individual users, we remove content that violates our community standards.”

Facebook said it recognised “the sensitivities around the issue of Holocaust denial in Germany and other countries and [we] have made sure that our reviewers are trained to be respectful of that sensitivity.”

Facebook has given certain groups “protected category” status when it comes to “hate speech” – telling moderators to delete hate speech that targets them.

One example of a protected category. Photograph: Guardian
And non-protected categories. Photograph: Guardian

The files explain that “countries are not protected … people from a country are protected”. Followers of a particular religion are also protected, “not the religion itself”, the document states.

Groups that are not protected from hate speech include politicians from all parties, and people who are “blonde, brunette, short, tall, fat and thin”.

One slide explains that it is permissible to say: “All terrorists are Muslims”, but it is not permitted to say: “All Muslims are terrorists.” Facebook explains that terrorists are not a protected category, whereas Muslims are – which is why the first remark can be ignored and the second should be deleted if flagged.

Other comments that flout Facebook’s guidelines include “French girls are stupid” and “Irish are stupid”. But moderators are told to ignore “Blonde women are stupid” and “Redheads are disgusting”. According to the documents, Facebook tells moderators to err on the side of allowing content if they are unsure.

“When context is ambiguous about whether a PC (protected category) or non-PC is being attacked, the default action is for reps to ignore,” one slide says. It uses an example involving a photograph of Syrian refugees surrounding children in a swimming pool. The caption to the picture reads: “The scum need to be eliminated.” Facebook says this comment should not be deleted if flagged.

“Because it is ambiguous whether the caption is attacking Syrian refugees (PC) or perpetrators of sexual assault (OR the subcategory Syrian refugees who commit sexual assault), the correct action is to ignore.”

In November, it was reported that Facebook was working on censorship tools to entice China to allow it back into the country.

In a report published earlier this month, British MPs said it was “shockingly easy to find examples of material that was intended to stir up hatred against ethnic minorities on all three of the social media platforms that we examined – YouTube, Twitter and Facebook”.

They added: “On Facebook we found community pages devoted to stirring up hatred, particularly against Jews and Muslims. We found openly antisemitic and Islamophobic community pages.”

The Facebook documents on Holocaust denial highlight the company’s complicated relationship with state censorship.

The company has been criticised for its willingness to comply with censorship demands from the governments of Turkey, India and Pakistan, which account for the majority of Facebook’s government-requested takedowns.
