Making images of paedophilia harder to find on Google is no bad thing, but it doesn’t tackle the abuse itself

Real action to protect children is about more than search algorithms

Ellen E. Jones
Monday 18 November 2013 20:22 GMT


Here are two things that you don’t want to think about on any morning: child sex abuse images on the internet, and the prospect of teenagers becoming sexually active before they are mature enough to deal with the consequences. And here are two obliging solutions which mean you’ll no longer have to: Google and Microsoft have agreed to block images of paedophilia from appearing in response to more than 100,000 search terms, and politicians from across the spectrum have swiftly rejected Professor John Ashton’s suggestion that the legal age of consent be lowered to 15.

The blocking of search terms has been hailed by the Daily Mail as “a stunning victory” and by the Prime Minister as “a really significant step forward”. We are encouraged to view the move as the triumph of moral, responsible adults over the amoral internet service providers who, according to Cameron, “argued that it was against the very principle of the internet and search engines to block material”. Certainly any freedom of speech arguments for leaving search terms unregulated look ridiculous set against the horror of the cases of Mark Bridger and Stuart Hazell, who last year were shown to have searched for child abuse images prior to committing murder. Such crimes are unspeakable, and that’s the beginning of our problem with legislating against them.

The real choice for the Government was not between blocking search terms and doing nothing; it was between a step which merely appears decisive and a policy which will result in real, effective action against the causes of child abuse. While the PM, Microsoft and Google are busy patting themselves on the back, there is time to point out the limitations of the move. It will not prevent child abuse, it will not remove the images of the abuse from the internet, it will not even prevent their circulation, since, as has been pointed out, this will likely continue in the way it always did, via the peer-to-peer networks known as “the dark net”. Where we have unquestionably succeeded, however, is in obscuring the abuse of children from view, and this is not an unequivocal victory.

Too much policy in the area of under-16s and sex seems shaped to spare adults from revulsion rather than to save children from abuse. We are so disturbed by the spectre of paedophilia that we avoid confronting it head-on as a social problem, all the while fiddling around the edges of policy reform.

Even if you disagree, as I do, with Ashton’s suggestion that lowering the age of consent will ultimately help confused teenagers, the short shrift given to his call for a debate is yet another example of an unhelpful public queasiness regarding the issue. The confusion that Ashton describes regarding access to support and services for young people is absolutely real, and the fact that around a third of teenagers are sexually active before they turn 16 suggests that the legal age of consent plays little role in their decision-making. Meanwhile, the cultural influences which contribute to the sexualisation of children at a younger and younger age are many and varied. And so far there has been little concerted effort to tackle them.

Real action to protect children would mean confronting issues that are far more taxing than the search engine algorithms which apparently had the Government stumped. It would mean not only compulsory sex education in schools, but high-quality, internet-literate sex education of the kind already proposed by Warwickshire County Council and subsequently attacked as “crude” and “a complete misuse of taxpayers’ cash”. It would mean investment not only in law enforcement resources to track down and prosecute perpetrators of child abuse, but also in a care system which too often leaves children vulnerable to exploitation. Ultimately, it would also mean objective research into the psychology of abusers, research for which there is little appetite at present.

Those who celebrate this action from Microsoft and Google would like to see it as a first, faltering step on the road towards such meaningful action, and hopefully it is. But we should also consider a more depressing possibility: that it is a policy-level continuation of the out-of-sight, out-of-mind response to the reality of child sex abuse that has long been the default for even the most well-meaning of adults. Historically, in schools, in churches and elsewhere, the compulsion to deny and suppress discussion has again and again proven stronger than the instinct to protect children.
