Google accidentally reveals names of rape victims despite law
The company says it will remove any similar examples in the future
Google has been inadvertently revealing the names of rape victims that should remain secret.
Searching for details of certain cases can bring up autocomplete predictions that reveal confidential information, including victims' names. Complainants in rape cases are entitled to lifelong anonymity, even if the accused is found not guilty.
A number of examples of searches that revealed the names of accusers were seen by The Times. Google has admitted that the unlawful information was shown, but said it would work to stop such predictions appearing in the future.
Google's autocomplete feature pops up whenever a user starts typing and attempts to guess what the person is looking for so they can get there more quickly. It does so algorithmically, drawing on logs of common and trending searches as well as information about the user's location and previous searches.
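To illustrate the general idea, the minimal sketch below shows how prefix-based suggestion from logged queries can work, with a simple blocklist filter of the kind a removals policy relies on. The query log, blocked terms and function names here are hypothetical examples, not Google's actual implementation.

```python
from collections import Counter

# Hypothetical log of past queries and how often each was searched.
query_log = Counter({
    "weather london": 5200,
    "weather tomorrow": 3100,
    "west end shows": 900,
})

# Hypothetical blocklist of terms whose predictions must never be shown.
blocked_terms = {"example-victim-name"}

def suggest(prefix, k=3):
    """Return up to k of the most frequent logged queries starting with
    the prefix, skipping any that contain a blocked term."""
    prefix = prefix.lower().strip()
    candidates = [
        (query, count) for query, count in query_log.items()
        if query.startswith(prefix)
        and not any(term in query for term in blocked_terms)
    ]
    # Most frequent first, mirroring a "common and trending" ranking.
    candidates.sort(key=lambda item: item[1], reverse=True)
    return [query for query, _ in candidates[:k]]

print(suggest("we"))  # ['weather london', 'weather tomorrow', 'west end shows']
```

The weakness such a filter shares with any blocklist approach is that it only catches predictions someone has already thought to block, which is why novel searches slip through.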
But because that information is drawn from across the internet, autocomplete can show irrelevant and sometimes shocking results. In this case, the names appear to have been pulled in from social media discussions in which people were illegally identifying accusers.
Google has systems in place that are intended to catch inappropriate predictions and avoid showing them to users. But the huge number of searches going through the site – 15 per cent of which have never been seen before – makes catching all of them difficult.
Google said that the autocomplete predictions highlighted by The Times were against its rules, and that it had removed the examples it had been alerted to and would remove any similar ones in the future.
"We don't allow these kinds of autocomplete predictions or related searches that violate laws or our own policies and we have removed the examples we’ve been made aware of in this case," a Google spokesperson said. "We recently expanded our removals policy to cover predictions which disparage victims of violence and atrocities, and we encourage people to send us feedback about any sensitive or bad predictions."
It is far from the first time that Google's autocomplete feature has brought trouble to the company. Because it is based on searches and other data from the internet, it can occasionally show highly offensive predictions even for apparently innocent searches.
Google's rules for what appears on search results pages – as opposed to the autocomplete box that pops up to help users get there – are enforced differently, and it is not thought that the confidential names were displayed when users actually clicked through to the results.