Zuckerberg must tolerate the deepfake treatment – and it’s all thanks to Facebook’s shoddy approach to news

Under the watchful gaze of the global community, each minute that the video stays up serves as a test for the social media company

Kuba Shand-Baptiste
Wednesday 12 June 2019 16:24 BST


There’s delicious irony in all of this. Weeks after (once again) dismissing the dangers of doctored political footage on Facebook, its founder and CEO unwittingly landed the starring role in a misleading video of his own.

Well, not technically his own: it’s part of an art installation called Spectre that he had nothing to do with. But that’s exactly the point of the video and other “deepfakes” like it – to make us believe that what we’re watching came straight from the horse’s mouth, as opposed to complex computer programming or convincing video editing.

To bring you up to speed: towards the end of last month, House Speaker Nancy Pelosi became the target of an attack on her credibility. The altered video, which used footage from a speaking engagement at the Centre for American Progress, convincingly made it seem as if, instead of speaking in the clear and concise manner we’re used to, Pelosi was having trouble speaking altogether, slurring and repeating her words as if under the influence.

It gained at least 1.4 million views on Facebook, was tweeted, predictably, by Donald Trump, reported on in its doctored form by – again, no surprises here – Fox News, and generally spread far enough to give some people the impression that there was truth in it.

Still, Facebook (and Twitter, which has not yet removed Trump’s tweet) opted to keep it up, explaining that although fact-checkers had concluded the video was “false”, they “don’t have a policy that stipulates that the information you post on Facebook must be true”, according to The Washington Post. Facebook did say it had reduced the video’s ranking, but it did not go as far as actually removing it. YouTube, however, did.

This video wasn’t necessarily created with the same technology as the Zuckerberg example, but the intention is exactly the same: to discredit or humiliate public figures at will – and/or to convince us to believe that what we’re watching is rooted in fact.

Which brings me to Zuckerberg. He has spent much of the last few years fending off intense international scrutiny over fake news. It was almost inevitable that he’d find himself in this position.

In his video, Zuckerberg’s speech has been manipulated to make it look like he’s endorsing the Spectre project. It’s not malicious like the Pelosi video – but it raises the same issues. Rather than attempting to mislead people, these videos (which feature Kim Kardashian among other “influencers”) have a stated purpose: to prise apart “the ‘black box’ of the digital influence industry and reveal to others what it is really like”.

Bill Posters, one of the creators of Spectre, expanded on that thought as part of the launch of the project. “The fact that citizens’ data – including intimate knowledge on political leanings, sexuality, psychological traits and personality – are made available to the highest bidder shows that the digital influence industry and its associated architectures pose a risk not only to individual human rights but to our democracies at large.”

Under the watchful gaze of the global community, each minute that the Zuckerberg video stays up serves as a test for the social media company: will it or won’t it apply its lax approach when it comes to this misleading content?

At the time of writing, the Zuckerberg video is still on Instagram, which is owned by Facebook. And I suspect it will stay up, if only to avoid the sure-fire backlash that would ensue if the company brazenly leaned into hypocrisy by taking it down.

But what’s really interesting is how this proves exactly the point people have been making for years when it comes to the dangers of deepfakes, or any kind of attempt to distort facts on social media. No one is safe. Not even the man with perhaps the most power to act. Maybe he now realises what his company was doing to the rest of the world.

Facebook’s approach to fakery has been consistently troubled. It partnered with the fact-checking website Snopes.com at the end of 2016 as the pressure mounted, but the relationship soured and the partnership was ended in February. Brooke Binkowski, a former managing editor at Snopes, has said she became frustrated by Facebook’s “inaction” over hate speech and misinformation during the Rohingya crisis in Myanmar.

“I was bringing up Myanmar over and over and over … They were absolutely resistant,” she said.

She is not the only one to raise concerns over the effectiveness of these partnerships, especially given the company’s unwillingness to share data about how helpful they might have been in the battle against fake news.


Zuckerberg has been placed in the limelight at an intriguing time. Only days ago, Congress announced a probe into the unchecked power of Facebook and other social media giants, while the company continues to be linked to distortions of campaigns around Brexit in the UK. Facebook has also just launched its Study app, which pays users for extensive access to their data.

The CEO’s sudden personal immersion in the debate has been a long time coming. Facebook may not want to take the bait now, but this certainly won’t be the last time something like this happens, and next time it may not come from such a benign source. If it wasn’t already clear that we – all of us, even the seemingly untouchable tech billionaires – are fair game in the fake news and data stealing wars, it certainly is now.
