Facebook removes ‘napalm girl’ picture: How one Norwegian status update triggered worldwide censorship debate
For those caught up in it, the dispute isn’t just about one picture but about the media as a whole
It began with a Facebook post. It soon became an international argument between one of the world’s biggest media companies and the prime minister of Norway.
A single picture, shared on Facebook, started a debate about freedom of speech and the future of the media.
The photograph was the famous image, taken in 1972, of a screaming girl running from a napalm attack in Vietnam. The Pulitzer Prize-winning photo, taken by Nick Ut, was shared by a Norwegian newspaper, initially as part of a routine piece detailing the world’s most important photographs.
But when it was posted on Facebook, it was deleted. The company said it was forced to take the image down because it showed a naked child, and that the context didn’t matter.
The site said that it was “difficult to create a distinction between allowing a photograph of a nude child in one instance and not others”. That refusal to draw a line irritated the news organisation that posted the image and its supporters, many of whom argued that it cut to the heart of a central problem with Facebook’s dominance in the media. The distinction, they said, was obvious, and any company that wanted to become the kind of news source and media empire Facebook does needs to be able to make it.
Deleting the post showed that Facebook cared less for history than for strict adherence to its own guidelines, many said.
“What they do by removing images of this kind, whatever good intentions, is to edit our common history,” said Norwegian prime minister Erna Solberg.
She became one of the many people who shared the photo in solidarity with the Norwegian author who had posted it. Several members of the Norwegian government followed, posting the picture themselves, and after her own post was removed from Facebook, Ms Solberg re-shared the image with the girl blacked out.
Ms Solberg said that she wanted to protest against Facebook’s removal of the picture because of the damage it could do to reporting, and potentially to our understanding of history itself. “Today, pictures are such an important element in making an impression, that if you edit past events or people, you change history and you change reality,” she wrote.
It was far from the first time that Facebook’s editing – which is done by humans and algorithms, mostly working together – had caused controversy. The site is by far the most viewed news and information site in the world, and so the choices it makes are scrutinised, and often cause outrage.
Earlier this year, for instance, it was claimed that the news panel that sits alongside Facebook’s news feed was favouring articles from liberal sources and excluding conservative news. The humans who edited that panel were being told not to feature right-wing outlets, reports claimed, despite the appearance that the news was picked fairly, based on what people were talking about.
Facebook then fired the human editors who had been choosing what went into that panel and switched instead to algorithms that tracked which words were being mentioned most. Other problems soon followed: since the algorithm couldn’t distinguish real news from fake, or judge whether anything was newsworthy at all, it started pushing bizarre and entirely false stories to the more than a billion people who use the site.
But this time the problem appeared to have nothing to do with news: the picture was removed from the site entirely, not just from the news panel, and under the rules that govern everything posted on the site. That is presumably a result of the way content is moderated on Facebook, which uses a combination of artificial intelligence and user reports to flag problem material and then lets its workers decide whether it should be banned.
That was the system that banned a picture of a girl’s skin being burnt off by napalm because it depicted a “nude child”, and it was the system that, according to Facebook’s statement, couldn’t distinguish between that picture and anything more pedestrian.
"We try to find the right balance between enabling people to express themselves while maintaining a safe and respectful experience for our global community," Facebook's statement said. "Our solutions won't always be perfect, but we will continue to try to improve our policies and the ways in which we apply them."
Those guidelines began life as the rules governing what was then just one of many social media sites, and they are presumably written without any flexibility, so that the huge array of people who sit looking through potentially infringing content know exactly what can and can’t be hosted on Facebook’s platform.
But the site is gradually becoming the most powerful force in the media, deciding whether a news story or piece of information is seen by almost everyone with an internet connection or by almost no one. As such, the scrutiny it attracts is becoming something like that applied to a government – fitting for a site with a population bigger than Norway’s, and bigger than that of almost every other country on Earth.