Pornography viewers could be forced to scan their faces before watching videos under new UK rules
Other systems such as warnings or asking people their age will not work, regulator says
Viewers of pornography could soon be forced to scan their faces to ensure they are old enough to view adult videos.
The proposal comes from UK online safety regulator Ofcom, which has published its guidance on how it intends to enforce the Online Safety Act. That law requires adult websites to ensure they are only viewed by over-18s, amid fears over the damage that pornography could be causing to children.
The draft guidance says websites must use methods which are technically accurate, robust, reliable and fair to carry out age checks, and recommends firms consider options such as open banking – where a user consents to their bank sharing information confirming they are over 18.
Ofcom also suggests other methods which could be used, such as: photo ID matching, where an uploaded document such as a passport is compared with an image taken at that moment; verified facial age estimation technology; mobile network age checks, which automatically block age-restricted websites if the operator knows the user is under 18; credit card checks; or digital identity wallets, where a user’s proof of age is stored digitally and can be shared with the online pornography service.
However, the regulator said certain approaches would not meet its new standards, including self-declaration of age, online payment methods which do not require a person to be 18, such as a debit card, or general terms, disclaimers or warnings about content.
Under the Online Safety Act, platforms which do not comply with the new laws will face enforcement action, including possible fines.
“Pornography is too readily accessible to children online, and the new online safety laws are clear that must change,” Ofcom chief executive Dame Melanie Dawes said.
“Our practical guidance sets out a range of methods for highly effective age checks. We’re clear that weaker methods – such as allowing users to self-declare their age – won’t meet this standard.
“Regardless of their approach, we expect all services to offer robust protection to children from stumbling across pornography, and also to take care that privacy rights and freedoms for adults to access legal content are safeguarded.”
Ofcom said it would continue to work with online pornography services to finalise the draft guidance before a final version is published in early 2025, after which the Government will bring the duties it sets out into force.
Technology Secretary Michelle Donelan said: “Pornography can have an absolutely devastating impact on children and their view of healthy relationships.
“Right now, 13 is the average age at which a child first encounters it online. This is exactly why I made protecting children from pornography a key objective of our Online Safety Act.
“Companies must now work closely with Ofcom to ensure they have robust checks in place to stop children from seeing harmful content that they can never unsee.
“Consulting on how platforms must meet their new duties is key to making sure companies know exactly what is expected of them, allowing us to press ahead with this new online safety regime and the vital protections for our children that come with it.”
Additional reporting by agencies