What is a deepfake and why does the government want to make them illegal?
Deepfake images and videos have become increasingly prevalent with celebrities such as Taylor Swift and Cathy Newman falling victim
The creation of sexually explicit “deepfake” images is to be made a criminal offence under new legislation, the Ministry of Justice has announced.
Individuals who create sexually explicit deepfakes without consent could face an unlimited fine or jail time under the new laws.
The creation of a deepfake will be an offence irrespective of whether the creator intends to share it.
Deepfake images have become more prevalent in recent years, with such content viewed millions of times a month around the world.
The fake images and videos are made to look hyper-realistic, with victims usually unaware of the content and unable to consent to being sexualised in this way.
The new offence will be introduced through an amendment to the controversial Criminal Justice Bill, which is still making its way through parliament.
Prime Minister Rishi Sunak said the government will crack down on “vile degenerates” who create “distressing and abusive” deepfakes, and the announcement has been welcomed by campaigners and politicians.
What is a deepfake image?
The term “deepfake” is a portmanteau of “deep learning” and “fake”, and refers to an image that has been digitally manipulated to replace one person’s likeness convincingly with that of another.
Deepfakes have become increasingly realistic thanks to artificial intelligence (AI), and can also be entirely computer-generated images of people who do not exist in real life.
Although deepfakes are not always sexually explicit, they have garnered widespread attention for their use in creating child sexual abuse material, celebrity pornographic videos and revenge porn.
Researchers found just one deepfake pornography video online in 2016; in the first three quarters of last year, some 143,733 new deepfake pornography videos were uploaded to the 40 most-visited deepfake pornography sites – more than in all previous years combined.
Deepfakes have also been used to spread misinformation and fake news, and fraudsters have used deepfake audio in financial scams and bullying.
How does the technology work?
To generate convincing and realistic deepfake media, most current technologies require large amounts of genuine data such as images, footage or sound recordings.
Complex algorithms are used to mimic the target individual’s appearance, voice, or behaviour as closely as possible.
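To make that idea concrete, the sketch below shows the shared-encoder autoencoder approach behind many early face-swap tools: a single encoder learns features common to all faces, while a separate decoder is trained to render each individual, so swapping decoders at inference reproduces one person’s likeness with another’s pose and expression. This is a minimal illustration in PyTorch; the architecture, layer sizes and names are assumptions for demonstration, not any particular tool’s implementation.

```python
# Minimal sketch of the shared-encoder face-swap idea (illustrative only;
# sizes, names and architecture are assumptions, not a real tool's design).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a latent feature vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face crop from the shared latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder; one decoder per person.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Training step (sketched): each person is reconstructed through their own
# decoder, which is why large amounts of genuine footage are needed.
loss_fn = nn.MSELoss()
faces_a = torch.rand(8, 3, 64, 64)  # stand-ins for real face crops
faces_b = torch.rand(8, 3, 64, 64)
loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
        + loss_fn(decoder_b(encoder(faces_b)), faces_b))
loss.backward()  # in practice, repeated over many epochs with an optimiser

# The "swap": encode person A's expression and pose, then decode with
# person B's decoder to produce B's likeness performing A's movements.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

Because the shared encoder captures pose and expression while each decoder carries a single person’s appearance, the swap happens simply by routing one person’s encoding through the other’s decoder – which is why convincing results require large volumes of genuine images of the target.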
Deepfakes can have positive applications in entertainment, education, medicine and other fields, particularly for modelling and predicting behaviour, and their use is growing. However, the possibilities for abuse are also growing as distribution platforms become more accessible and the technology becomes cheaper.
Prominent examples
More than 250 British celebrities have been victims of deepfake porn, according to an investigation published last month.
Among them is news presenter Cathy Newman, who said she felt violated after watching digitally altered footage in which her face was superimposed onto pornography using AI.
Channel 4 aired its investigation in March, having analysed the five most-visited deepfake websites and found that 255 of the almost 4,000 famous individuals listed were British – all but two of them women.
Channel 4 News said it contacted more than 40 celebrities for the investigation, all of whom were unwilling to comment publicly.
The broadcaster also said it found that more than 70% of visitors arrived at deepfake websites using search engines like Google.
Earlier this year, deepfake images of pop star Taylor Swift were posted to X, formerly Twitter. The Elon Musk-owned platform blocked searches linked to the singer after fans lobbied it to take action.
Industry experts have warned of the danger posed by AI-generated deepfakes and their potential to spread misinformation, particularly in a year that will see major elections in many countries, including the UK and the US.
How have campaigners and politicians reacted to the announcement?
Minister for Victims and Safeguarding Laura Farris said the creation of deepfake sexual images is “unacceptable irrespective of whether the image is shared”.
“It is another example of ways in which certain people seek to degrade and dehumanise others – especially women,” she said.
“And it has the capacity to cause catastrophic consequences if the material is shared more widely. This Government will not tolerate it.
“This new offence sends a crystal clear message that making this material is immoral, often misogynistic, and a crime.”
Deborah Joseph, European editorial director of Glamour, said: “In a recent Glamour survey, we found 91% of our readers believe deepfake technology poses a threat to the safety of women, and from hearing personal stories from victims, we also know how serious the impact can be.
“While this is an important first step, there is still a long way to go before women will truly feel safe from this horrendous activity.”
Yvette Cooper, Labour’s shadow home secretary, also welcomed the announcement.
“Superimposing somebody’s image onto sexually explicit photos and videos is a gross violation of their autonomy and privacy, which can cause enormous harm, and it must not be tolerated,” she said.
Ms Cooper said it was “vital” the Government gets ahead of “fast-changing threats”.
She added: “It’s essential that the police and prosecutors are equipped with the training and tools required to rigorously enforce these laws in order to stop perpetrators from acting with impunity”.