Google says it is struggling to 'understand truth' because people are confusing its search algorithm
'It is difficult for us to sort out which rank, A or B, is higher,' says Eric Schmidt
Google has admitted it is having trouble working out what’s true and what’s false.
People are managing to confuse the company’s search algorithm, says Eric Schmidt, the executive chairman of Alphabet.
As a result, it is struggling to rank search results in order of accuracy.
“Let's say that this group believes Fact A and this group believes Fact B, and you passionately disagree with each other and you are all publishing and writing about it and so forth and so on,” Mr Schmidt said at the Halifax International Security Forum last weekend, reports CNBC.
“It is very difficult for us to understand truth.
“So when it gets to a contest of Group A versus Group B — you can imagine what I am talking about — it is difficult for us to sort out which rank, A or B, is higher.”
He added, however, that the algorithm can be tweaked if Google’s computer scientists believe it “is not doing a good enough job” of ranking content that has been manipulated.
“As a computer scientist, I can tell you, this stuff can be detected,” he said.
Mr Schmidt also linked the issue of truth to the so-called “filter bubble” effect, which has been blamed largely on Facebook.
The social network was fiercely criticised in the wake of the Brexit vote and Donald Trump’s election for supposedly shielding users from views they might disagree with, potentially leaving them with a skewed picture of what other people believe.
“That is a core problem of humans that they tend to learn from each other and their friends are like them,” he said.
“And so until we decide collectively that occasionally somebody not like you should be inserted into your database, which is sort of a social values thing, I think we are going to have this problem.”