Does Google Have A Racial Bias?

3 MIN
November 28, 2020

The Short Answer: No. Google is interested in searcher intent. Because of this, race-oriented searches show you what Google thinks you are looking for, not just pages that match the words in your search.

Larry Page, Google’s co-founder and former CEO, summed up the company’s vision for its search engine when he described the perfect search engine as “understanding exactly what you mean and giving you exactly what you want.”

Google wants to understand exactly what we mean when we perform a search. In other words, when I type in the term “apple,” am I referring to the fruit or to the technology company? It’s Google’s job to figure that out.

In one of its most recent updates, Google announced it was using an algorithm called BERT in nearly all English-language searches. BERT helps Google understand the intent behind longer, conversational queries phrased as questions or full phrases. The update reflects Google’s emphasis on searcher intent: the meaning behind the words, not just the words themselves.
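To make that idea concrete, here is a minimal sketch, not Google’s actual Search pipeline, using a small BERT-style sentence encoder from the open-source sentence-transformers library. The model name and example queries are illustrative assumptions; the point is simply that queries with similar intent end up with similar vectors even when they share few literal words.

```python
# A minimal sketch, NOT Google's production system: a small BERT-style
# sentence encoder turns queries into vectors, so queries with similar
# intent land close together even when they share few literal words.
# Assumes the open-source `sentence-transformers` package is installed.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

queries = [
    "is an apple healthy to eat",    # fruit intent
    "apple nutrition facts",         # fruit intent, different wording
    "newest apple iphone release",   # company intent, shares the word "apple"
]
embeddings = model.encode(queries, convert_to_tensor=True)

# Pairwise cosine similarity: a higher score means closer intent.
# Typically the two fruit queries score closer to each other than either
# does to the iPhone query, despite the shared word "apple".
print(util.cos_sim(embeddings, embeddings))
```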

Searcher Intent Results Vs. Defining Results

Let’s compare Google with something that focuses more on defining results: a stock photo website, for example. When you search for “happy white women” on a stock photo site, its job is to say, "Happy white women? I know what that looks like. Here are all of the images I have of happy white women."

Now let's say you make the same search on Google. Google's algorithm, BERT, is going to say, "Happy white women? I've had _______ many people make searches that include that phrase, and they've ended up looking for one of these things. Here are all of those things!"

This is heavily affected by volume! Let's say "happy black women" is searched much more often than "happy white women." If that's the case, the images, articles, and videos that show up for these two searches are going to be very different!
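The contrast can be sketched as a toy model. Everything below is hypothetical: the catalog, the click log, and the ranking rule are made-up stand-ins, not how any real stock photo site or Google actually works. The point is only that one system matches the literal words, while the other ranks by what past searchers using the same phrase went on to choose.

```python
# Toy illustration with made-up data: literal word matching (like a stock
# photo site's tags) versus ranking by what past searchers actually clicked.

def literal_match(query: str, catalog: dict) -> list:
    """Return only items whose tags contain every word in the query."""
    words = set(query.lower().split())
    return [item for item, tags in catalog.items() if words <= tags]

def intent_rank(query: str, click_log: dict) -> list:
    """Rank items by how often past users who typed this query chose them."""
    clicks = click_log.get(query.lower(), {})
    return sorted(clicks, key=clicks.get, reverse=True)

# Hypothetical stock-photo catalog: item -> set of tags.
catalog = {
    "photo_001": {"happy", "white", "women", "friends"},
    "photo_002": {"happy", "women", "office"},
}

# Hypothetical click log: query -> {result: number of past clicks}.
click_log = {
    "happy white women": {"wedding_article": 40, "photo_001": 12, "comedy_video": 7},
}

print(literal_match("happy white women", catalog))  # only literal tag matches
print(intent_rank("happy white women", click_log))  # whatever people clicked most
```

In the second function, the order of results depends entirely on the (made-up) click counts, which is exactly the volume effect described above: whatever past searchers chose most often rises to the top.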

You can see these vast differences yourself with a practical exercise:

  1. Search for images of “happy black women”. 
  2. Search for “happy Asian women”. 
  3. Finally, search for “happy white women”. You’ll notice a big difference in the results!

Image: Google image search results for the three searches above.

The photos are probably not what you expected, but that doesn't mean the search is "wrong" or trying to influence how you think about race or gender. It's consistent with how Google's algorithm handles every search: it shows people what Google thinks they want, not just what they actually typed.

Similarly, if you typed "that movie about droids" into a Google search, Google would almost certainly show you pictures from Star Wars, not pictures from a movie called "Droids." That's because Google doesn't just translate your words into pictures; it tries to translate your words into an expression of what you want. For some reason, Google thinks this is what people want to see when they type in "happy white women."

Bottom line: Google's agenda is to give its users what they want. That's why it shows results based on searcher intent.