Germany acts to tame Facebook, learning from its own history of hate

A country taps its past as it leads the way on one of the most pressing issues facing modern democracies: how to regulate the world’s biggest social network

Katrin Bennhold
Friday 15 June 2018 12:07 BST

Security is tight at this brick building on the western edge of Berlin. Inside, a sign warns: “Everybody without a badge is a potential spy!”

Spread over five floors, hundreds of men and women sit in rows of six scanning their computer screens. All have signed nondisclosure agreements. Four trauma specialists are at their disposal seven days a week.

They are the agents of Facebook. And they have the power to decide what is free speech and what is hate speech.

This is a deletion centre, one of Facebook’s largest, with more than 1,200 content moderators. They are cleaning up content – from terrorist propaganda to Nazi symbols to child abuse – that violates the law or the company’s community standards.

Germany, home to a tough new online hate speech law, has become a laboratory for one of the most pressing issues for governments today: how and whether to regulate the world’s biggest social network.

Around the world, Facebook and other social networking platforms are facing a backlash over their failures to safeguard privacy, disinformation campaigns and the digital reach of hate groups.

In India, seven people were beaten to death after a false viral message on the Facebook subsidiary WhatsApp. In Myanmar, violence against the Rohingya minority was fuelled, in part, by misinformation spread on Facebook.

In the United States, Congress called Mark Zuckerberg, Facebook’s chief executive, to testify about the company’s inability to protect its users’ privacy.

As the world confronts these rising forces, Europe, and Germany in particular, have emerged as the de facto regulators of the industry, exerting influence beyond their own borders.

Gerd Billen says that data protection is a fundamental right (Alamy)

Berlin’s digital crackdown on hate speech, which took effect on 1 January, is being closely watched by other countries. And German officials are playing a major role behind one of Europe’s most aggressive moves to rein in technology companies, strict data privacy rules that took effect across the European Union last month and are prompting global changes.

“For them, data is the raw material that makes them money,” said Gerd Billen, secretary of state in Germany’s Ministry of Justice and Consumer Protection. “For us, data protection is a fundamental right that underpins our democratic institutions.”

Germany’s troubled history has placed it on the front line of a modern tug-of-war between democracies and digital platforms.

In the country of the Holocaust, the commitment against hate speech is as fierce as the commitment to free speech. Hitler’s Mein Kampf is available only in an annotated version. Swastikas are illegal. Inciting hatred is punishable by up to five years in jail.

But banned posts, pictures and videos have routinely lingered on Facebook and other social media platforms. Now companies that systematically fail to remove “obviously illegal” content within 24 hours face fines of up to €50m.

The deletion centre predates the legislation, but its efforts have taken on new urgency. Every day content moderators in Berlin, hired by a third-party firm and working exclusively on Facebook, pore over thousands of posts flagged by users as upsetting or potentially illegal and make a judgment: ignore, delete or, in particularly tricky cases, “escalate” to a global team of Facebook lawyers with expertise in German regulation.

Some decisions to delete are easy. Posts about Holocaust denial and genocidal rants against particular groups like refugees are obvious ones for taking down.

Others are less so. On 31 December, the day before the new law took effect, a far-right lawmaker reacted to an Arabic New Year’s tweet from the Cologne police, accusing them of appeasing “barbaric, Muslim, gang-raping groups of men”.

The request to block a screenshot of the lawmaker’s post wound up in the queue of Nils, a 35-year-old agent in the Berlin deletion centre. His judgment was to let it stand.

A colleague thought it should come down. Ultimately, the post was sent to lawyers in Dublin, London, Silicon Valley and Hamburg. By the afternoon it had been deleted, prompting a storm of criticism about the new legislation, known here as the “Facebook Law”.

“A lot of stuff is clear-cut,” Nils said. Facebook, citing his safety, did not allow him to give his surname. “But then there is the borderline stuff.”

Complicated cases have raised concerns that the threat of the new rules’ steep fines and 24-hour window for making decisions encourage “over-blocking” by companies, a sort of defensive censorship of content that is not actually illegal.

The far-right Alternative für Deutschland, a noisy and prolific user of social media, has been quick to proclaim “the end of free speech”. Human rights organisations have warned that the legislation is inspiring authoritarian governments to copy it.

The centre in Berlin has more than 1,200 content moderators helping to clean up illegal content like terrorist propaganda and Nazi symbols (Alamy)

Other people argue that the law simply gives a private company too much authority to decide what constitutes illegal hate speech in a democracy, an argument that Facebook, which favoured voluntary guidelines, made against the law.

“It is perfectly appropriate for the German government to set standards,” said Elliot Schrage, Facebook’s vice-president of communications and public policy. “But we think it’s a bad idea for the German government to outsource the decision of what is lawful and what is not.”

Richard Allan, Facebook’s vice-president for public policy in Europe and the leader of the company’s lobbying effort against the German legislation, put it more simply: “We don’t want to be the arbiters of free speech.”

German officials counter that social media platforms are the arbiters anyway.

It all boils down to one question, said Billen, who helped draw up the new legislation: “Who is sovereign... parliament or Facebook?”

When Nils applied for a job at the deletion centre, the first question the recruiter asked him was: “Do you know what you will see here?”

Nils has seen it all. Child torture. Mutilations. Suicides. Even murder: He once saw a video of a man cutting a heart out of a living human being.

And then there is hate.

“You see all the ugliness of the world here,” Nils said. “Everyone is against everyone else. Everyone is complaining about that other group. And everyone is saying the same horrible things.”

The issue is deeply personal for Nils. He has a 4-year-old daughter. “I’m also doing this for her,” he said.

The centre here is run by Arvato, a German service provider owned by the conglomerate Bertelsmann. The agents have a broad purview, reviewing content from half a dozen countries.

Those with a focus on Germany must know Facebook’s community standards and, as of January, the basics of German hate speech and defamation law.

“Two agents looking at the same post should come up with the same decision,” said Karsten Konig, who manages Arvato’s partnership with Facebook.

The Berlin centre opened with 200 employees in 2015, as Germany was opening its doors to hundreds of thousands of migrants.

That year a selfie went viral.

Anas Modamani, a Syrian refugee, posed with Chancellor Angela Merkel and posted the image on Facebook. It instantly became a symbol of her decision to allow in hundreds of thousands of migrants.

Soon it also became a symbol of the backlash.

The image showed up in false reports linking Modamani to terrorist attacks in Brussels and on a Christmas market in Berlin. He sought an injunction against Facebook to stop such posts from being shared but eventually lost.

Richard Allan, Facebook’s vice president for public policy in Europe, leaving a meeting at Germany’s justice ministry in March (Alamy)

The arrival of nearly 1.4 million migrants in Germany has tested the country’s resolve to keep a tight lid on hate speech. The law on illegal speech was long-established but enforcement in the digital realm was scattershot before the new legislation.

Posts calling refugees rapists, Neanderthals and scum survived for weeks, according to jugendschutz.net, a publicly funded internet safety organisation. Many were never taken down. Researchers at jugendschutz.net reported a tripling in observed hate speech in the second half of 2015.

Billen, the secretary of state in charge of the new law, was alarmed. In September 2015 he convened executives from Facebook and other social media sites at the justice ministry, a building that was once the epicentre of state propaganda for the communist East.

A task force for fighting hate speech was created. A couple of months later, Facebook and other companies signed a joint declaration, promising to “examine flagged content and block or delete the majority of illegal posts within 24 hours”.

But the problem did not go away. Over the 15 months that followed, independent researchers, hired by the government, twice posed as ordinary users and flagged illegal hate speech. During those tests, they found that Facebook had deleted just 46 per cent and 39 per cent of the flagged posts, respectively.

“They knew that they were a platform for criminal behaviour and for calls to commit criminal acts, but they presented themselves to us as a wolf in sheep’s clothing,” Billen said.

By March 2017, the German government had lost patience and started drafting legislation. The Network Enforcement Law was born, setting out 21 types of content that are “manifestly illegal” and requiring social media platforms to act quickly.

Officials say early indications suggest the rules have served their purpose. Facebook’s performance on removing illegal hate speech in Germany rose to 100 per cent over the past year, according to the European Union’s latest spot check.

Platforms must publish biannual reports on their efforts. The first is expected in July.

At Facebook’s Berlin offices, Allan acknowledged that under the earlier voluntary agreement, the company had not acted decisively enough at first.

“It was too little and it was too slow,” he said. But, he added, “that has changed”.

He cited another independent report for the European Commission from last summer that showed Facebook was by then removing 80 per cent of hate speech posts in Germany.

The reason for the improvement was not German legislation, he said, but a voluntary code of conduct with the European Union. Facebook’s results have improved in all European countries, not just in Germany, Allan said.

“There was no need for legislation,” he said.

Billen disagrees.

“They could have prevented the law,” he said. YouTube scored 90 per cent in last year’s monitoring exercise. If other platforms had done the same, there would be no law today, he said.

Germany’s hardline approach to hate speech and data privacy once made it an outlier in Europe. The country’s stance is now more mainstream, an evolution embodied by the justice commissioner in Brussels.

Vera Jourova, the justice commissioner, deleted her Facebook account in 2015 because she could not stand the hate anymore.

“It felt good,” she said about pressing the button. She added: “It felt like taking back control.”

But Jourova, who grew up behind the Iron Curtain in what is now the Czech Republic, had long been sceptical about governments legislating any aspect of free speech, including hate speech.

Her father lost his job after making a disparaging comment about the Soviet invasion in 1968, and the black mark barred her from going to university until she married and took her husband’s name.

“I lived half my life in the atmosphere driven by Soviet propaganda,” she said. “The golden principle was: if you repeat a lie a hundred times it becomes the truth.”

When Germany started considering a law, she instead preferred a voluntary code of conduct. In 2016, platforms like Facebook promised European users easy reporting tools and committed to removing most illegal posts brought to their attention within 24 hours.

The approach worked well enough, Jourova said. It was also the quickest way to act because the 28 member states in the European Union differed so much about whether and how to legislate.

But the stance of many governments towards Facebook has hardened since it emerged that the consulting firm Cambridge Analytica had harvested the personal data of up to 87 million users.

Anas Modamani, a Syrian refugee, took a selfie with Chancellor Angela Merkel in Berlin in September 2015 (Alamy)

Representatives of the European Parliament asked Zuckerberg to come to Brussels to “clarify issues related to the use of personal data” – he did so last month, but left without committing to anything substantial.

Jourova, whose job is to protect the data of more than 500 million Europeans, has hardened her stance as well.

“Our current system relies on trust and this did nothing to improve trust,” she said. “The question now is how do we continue?”

The European Commission is considering German-style legislation for online content related to terrorism, violent extremism and child pornography, including a provision that would include fines for platforms that did not remove illegal content within an hour of being alerted to it.

Several countries – France, Israel, Italy and Canada among them – have sent queries to the German government about the impact of the new hate speech law.

And Germany’s influence is evident in Europe’s new privacy regulation, known as the General Data Protection Regulation (GDPR). The rules give people control over how their information is collected and used.

Inspired in part by German data protection laws written in the 1980s, the regulation has been shaped by a number of prominent Germans. Jourova’s chief of staff, Renate Nikolay, is German, as is her predecessor’s chief of staff, Martin Selmayr, now the European Commission’s secretary-general.

The lawmaker in charge of the regulation in the European Parliament is German, too.

“We have built on the German tradition of data protection as a constitutional right and created the most modern piece of regulation of the digital economy,” Nikolay said.

“To succeed in the long term, companies need the trust of customers,” she said. “At the latest, since Cambridge Analytica it has become clear that data protection is not just some nutty European idea, but a matter of competitiveness.”

On 26 March Jourova wrote a letter – by post, not email – to Sheryl Sandberg, Facebook’s chief operating officer.

“Is there a need for stricter rules for platforms like those that exist for traditional media?” she asked.

“Is the data of Europeans affected by the current scandal?” she added, referring to the Cambridge Analytica episode. And, if so, “How do you plan to inform the user about this?”

She demanded a reply within two weeks, and she got one. Some 2.7 million Europeans were affected, Sandberg wrote. But she never answered Jourova’s question on regulation.

Vera Jourova, the European Union’s justice commissioner, deleted her Facebook account in 2015 because she could no longer stand the hate (Alamy)

“There is now a sense of urgency and the conviction that we are dealing with something very dangerous that may threaten the development of free democracies,” said Jourova, who is also trying to find ways to clamp down on fake news and disinformation campaigns.

“We want the tech giants to respect and follow our legislation,” she added. “We want them to show social responsibility both on data protection and on hate speech.”

So do many Facebook employees, Allan, the company executive, said.

“We employ very thoughtful and principled people,” he said. “They work here because they want to make the world a better place, so when an assumption is made that the product they work on is harming people it is impactful.”

“People have felt this criticism very deeply,” he added.

Nils works eight-hour shifts. On busy days, 1,500 user reports are in his queue. Other days, there are only 300. Some of his colleagues have nightmares about what they see.

Every so often someone breaks down. A mother recently left her desk in tears after watching a video of a child being sexually abused. A young man felt physically sick after seeing a video of a dog being tortured. The agents watch teenagers self-mutilating and girls recounting rape.

They have weekly group sessions with a psychologist and the trauma specialists on standby. In more serious cases, the centre teams up with clinics in Berlin.

In the office, which is adorned with Facebook logos, fresh fruit is at the agents’ disposal in a small room where subdued colours and decorative moss growing on the walls are meant to calm fraying nerves.

To decompress, the agents sometimes report each other’s posts, not because they are controversial, but “just for a laugh,” said another agent, the son of a Lebanese refugee and an Arabic-speaker who has had to deal with content related to terrorism generally and the Islamic State specifically.

By now, he said, images of “weird skin diseases” affected him more than those of a beheading. Nils finds sports injuries like broken bones particularly disturbing.

There is a camaraderie in the office and a real sense of mission: Nils said the agents were proud to “help clean up the hate”.

© New York Times
