Fake porn videos featuring celebrities deleted from the internet in attempt to stop 'deepfake' footage

Very convincing videos can be made with just a simple piece of software

Andrew Griffin
Thursday 01 February 2018 16:08 GMT
Many of the videos are being shared on Reddit (REUTERS/Robert Galbraith)

Fake porn videos claiming to show celebrities are being deleted from the internet.

The footage – which has become infamous in recent weeks – can be made relatively easily, using just a simple application and some artificial intelligence. The results are convincing videos, almost indistinguishable from real ones, in which celebrities' faces are seamlessly edited into other footage.

In large part, that technology is being used to put celebrities' faces into adult videos, allowing people to claim that the footage shows famous people making pornographic films.

Now Gfycat, the San Francisco tech company that has hosted many of the videos, has said the posts are "objectionable" and that it will be removing them from its platform. "Our terms of service allow us to remove content that we find objectionable. We are actively removing this content," it said.

Much of the footage is created and uploaded to Gfycat before being shared on Reddit, where a large amount of it remains online, though it is expected to be removed soon.

Reddit has not yet commented on the phenomenon. But its rules ban "involuntary pornography", a stipulation originally intended to keep so-called revenge porn off the site, though enforcement requires the person depicted to complain.

The legality of such fake videos is still a matter for debate. Some have been taken down on copyright grounds, but it is not clear that the mere use of a famous person's image gives that person grounds to have a video removed.

Videos posted online show just how convincing many of those swaps can be.

They are made using one simple tool known as FakeApp, whose developer claims it has been downloaded more than 100,000 times. The app gives easy access to artificial intelligence tools that can spot a person's face and replace it with another, allowing someone to be convincingly inserted into an entirely different scene.
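Tools like FakeApp are widely reported to rely on deep neural networks trained on many images of a face, which is what makes the results so convincing. The basic idea behind any face swap, though – locate a face in each image, then paste one into the other's place – can be illustrated with a few lines of conventional code. The sketch below is not FakeApp's method; it is a minimal illustration using the open-source OpenCV library, with hypothetical file names (celebrity.jpg, scene_frame.jpg) standing in for real inputs.

import cv2

# Haar cascade face detector that ships with the opencv-python package
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def largest_face(image):
    """Return the bounding box (x, y, w, h) of the largest detected face."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError("no face found")
    return max(faces, key=lambda f: f[2] * f[3])

source = cv2.imread("celebrity.jpg")     # hypothetical source image
target = cv2.imread("scene_frame.jpg")   # hypothetical target frame

sx, sy, sw, sh = largest_face(source)
tx, ty, tw, th = largest_face(target)

# Crude swap: resize the source face to fit the target face region and paste it in.
face = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))
target[ty:ty + th, tx:tx + tw] = face

cv2.imwrite("swapped_frame.jpg", target)

A naive paste like this produces obvious seams and mismatched lighting; deepfake software instead trains a neural network to reconstruct one person's face in the pose, expression and lighting of another, frame by frame, which is why the output can be so hard to distinguish from genuine footage.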
