Google's search results aren't as unbiased as you think - and a lack of diversity could be the cause
The algorithms that decide Google's search results are designed by humans - and a lack of staff diversity could be making the service worse
Google is sometimes regarded as the 'perfect' search engine - it's certainly the most popular one on the web, mostly for the way it easily and accurately indexes the world's knowledge in a straightforward, unbiased way.
However, the algorithms and processes that make up Google's search results are designed and built by humans - who have their own prejudices, experiences and biases.
This issue was discussed in a February TEDx talk by Swedish journalist and lecturer Andreas Ekström, titled 'The myth of the unbiased search result'.
In the talk, Ekström mentions two separate incidents of 'Googlebombing' - the practice of manipulating Google's algorithms to get a piece of content to the top of the search results for a given topic, usually with an agenda in mind.
The first he mentions occurred shortly after Barack Obama took office in 2009. A group started a racist campaign to push a distorted image of First Lady Michelle Obama, altered to make her look like a monkey, to the top of the image search results for 'Michelle Obama'.
By captioning, tagging and changing the file name of this picture to 'Michelle Obama', and publishing it on a number of blogs and social media sites, the group tricked Google's algorithm into treating it as a popular and accurate image, pushing it to the top.
As a result, for a few weeks in 2009, this racist image appeared at the top of the Google search results for 'Michelle Obama'.
The campaign eventually ended, and the picture would likely have drifted down from the top spot over time, but Google decided to step in, changing its settings and manually removing the inaccurate and racist image from its results.
A couple of years later, a similar thing happened in the wake of the 2011 attacks in Norway, in which far-right terrorist Anders Behring Breivik murdered 77 people in Oslo and on the island of Utøya in separate bomb and gun attacks.
It was the deadliest attack in Norway since World War Two, and people quickly began to notice that Breivik had left a carefully laid digital trail: blog posts and an emailed manifesto setting out his racist and Islamophobic views.
In response to Breivik's efforts to spread his message online, a Swedish developer and search engine expert named Nikke Lindqvist launched a campaign, urging people to upload pictures of dog dirt to blogs and social media and tag them with Breivik's name.
Like the Obama campaign, it worked - and in the weeks following the attack, those Googling Breivik were confronted with hundreds of pictures of dog dirt.
Although different in motivation, the two campaigns worked in exactly the same way - but in the second, Google didn't step in, and the inaccurate Breivik images stayed at the top of the search results for much longer.
Few would argue that Google was wrong to end the Obama campaign or to let the Breivik one run its course, but together the incidents shed light on the fact that behind a tech company as large and faceless as Google, there are people deciding what we see when we search.
And at a time when Google has such a poor record on gender and ethnic diversity, and other companies struggle to address the imbalance (as IBM did when it attempted to get women into tech by encouraging them to 'Hack a Hairdryer'), this fact becomes all the more pressing.
Out of all of Google's technical staff worldwide, only 18 per cent are women. When it comes to ethnicity, two per cent of its tech staff are Hispanic and one per cent are black.
In light of Ekström's talk, many have questioned whether Google's practice of drawing the people who build and design its algorithms from a small, homogeneous pool could lead to unperceived biases, simply because the range of experiences and points of view at the company is narrower than it could be.
Speaking to The Independent, Ekström said: "It seems very obvious to anyone in tech I've spoken to that a diverse group will be more suited to sustainably solving a problem than a group of clones."
"We've come a long way in a short time when it comes to this issue, and I think we've got to the point where nobody really questions anymore the idea that your background and your experience is so profound that you will approach your professional challenges differently depending on your experience of life so far."
Google does fix its most controversial search results, as it did after an incident with Google Maps this year in which a search for a racist epithet took users to the White House.
Google's algorithm curates what its creators think we want to see - by increasing diversity amongst its staff, Google could more accurately provide its users with what they want, and possibly stop manipulation like this from happening in the first place.
This isn't necessarily Google's fault, however - it needs to hire the best people for the job, and those studying IT-related courses at top universities remain a fairly homogeneous group.
"Who goes to the most prestigious colleges? Who is deciding to major in what subject 15 or 20 years ago? It's going to take some time, but Google's made some incredible progress in this regard in the last decade," Ekström said.
In his view, fixing this problem needs a serious long-term approach from the company's HR department, but also a greater understanding of the difference between experience and skill.
"If we decide that your programming abilities are the most important thing then great, but what other qualities do you need? What other skillsets are required?" he says.
In his view, Google needs to ask what experience its teams are missing and what needs to be added.
"When you do that, it's easier next time around to say 'We're going to recruit a woman,' or 'We're going to recruit a person of colour', because the group we have at the moment is fairly similar and we need more points of view."
"When they look at a group that is lacking a certain skillset, Google looks to strengthen the group by bringing in different experiences and making it more diverse," he said.
Google, like every other tech company, is already taking steps in this direction in an effort to serve its users better.
Google is the world's most-used search engine for a reason, but it's not completely free from bias. People from different backgrounds bring different skills and perspectives to the company, and while Google and its rivals still have a long way to go on diversity, making their workforces less homogeneous could help them in the long run.