By Charles Taylor
10 Jul 2019

SEO Mythbusting: Does Google Have a Political Bias?


SEO is a field where testing assumptions can reveal truths about optimizing your website. Busting SEO myths helps us make better business decisions and get more from investments in SEO.

The last couple of years have been an interesting time for search and politics. Not separately, mind you — I am referring to the politics of search. On several occasions, Google has been accused of political bias — most recently and energetically by President Trump — and Google has even spoken before Congress on the subject.

I realize any political topic will get touchy, but I could not think of a more fun subject to try to test empirically.

The idea of explicit (or unintentional) bias in search results is concerning for many reasons; with so many people using search every day, the results they encounter can have huge political, social, cultural, or financial influence. The topic of bias in search has long garnered public interest, but in 2018 it reached an all-time high in web searches, according to data from Google Trends.

[Image: Donald Trump tweet, “Stop the Bias”]

It’s not much of a secret that Eric Schmidt, former Executive Chairman of Google and (at that time) Alphabet, endorsed former President Obama and had a close relationship with his White House and administration. It’s also known that Mr. Schmidt was an avid supporter of Hillary Clinton’s campaign.

Of course, there is nothing inherently illegal or unethical about this, nor should there be; people are allowed to have and support their political beliefs. I can, however, see why folks on the other side of the political aisle may be concerned.

Add to this the discovery that some Google employees had suggested adjusting search results to favor specific websites in response to President Donald Trump’s immigration travel ban, and you have the makings of a really great conspiracy theory.

While all of this political intrigue may be fun to discuss and argue about, the discussion really boils down to one thing: is there a political bias coded into Google’s algorithm?

The more I thought about testing this, the more I realized that it could become an enormous undertaking. While I initially mapped out over 25 separate tests I could run, I determined that I needed to focus and test individual variables.

I decided to first test if the algorithm has a general bias against President Trump, and then, if it prefers his opponents over him.

Testing for Political Bias in Google's Algorithm: The Process

In the first test, I wanted to see whether Google would promote or punish a webpage based on mentions of a specific person; in other words, whether the algorithm contains a political bias or treats everyone equally.

To run this test, I created seven pages with exactly the same body content; only the titles and meta descriptions differed, since those need to be unique or else Google will not index the pages. I then came up with a unique fake keyword that returned zero results on Google when searched.

Lastly, on all the pages, I needed a control name. I created a fake president name, “John Q. Smith,” and placed it in the title, meta description, and four times within the body copy. As one more note, I decided to mix up usage of the name. Instead of just typing “President John Q. Smith,” I used variations such as “President Smith,” “Mr. Smith,” and “John Smith.”
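
For illustration only, here is a minimal sketch of how a set of pages like this could be generated. The fake keyword, file names, and body copy below are invented placeholders, not the ones used in the actual tests.

```python
# Illustrative sketch only: generate seven near-identical test pages.
# The fake keyword, body copy, and file names are invented placeholders.
from pathlib import Path

FAKE_KEYWORD = "zyxblorptic review"          # made-up term with zero existing results
CONTROL_NAME = "John Q. Smith"               # fake "president" used as the control
NAME_VARIATIONS = ("President John Q. Smith", "President Smith",
                   "Mr. Smith", "John Smith")

# The body copy (including all four name references) is identical on every page.
BODY = """
<p>{v[0]} announced a new policy today regarding {kw}.</p>
<p>Supporters of {v[1]} praised the decision, while critics of {v[2]}
questioned its timing.</p>
<p>{v[3]} is expected to comment further on {kw} next week.</p>
""".format(v=NAME_VARIATIONS, kw=FAKE_KEYWORD)

PAGE = """<!DOCTYPE html>
<html>
<head>
<title>{title}</title>
<meta name="description" content="{description}">
</head>
<body>
<h1>{title}</h1>
{body}
</body>
</html>
"""

out_dir = Path("test-pages")
out_dir.mkdir(exist_ok=True)

for i in range(1, 8):
    # Only the title and meta description differ from page to page,
    # since duplicates there can keep Google from indexing the pages.
    title = f"{FAKE_KEYWORD} notes, part {i}, featuring {CONTROL_NAME}"
    description = f"Test page {i} about {FAKE_KEYWORD} and {CONTROL_NAME}."
    (out_dir / f"page-{i}.html").write_text(
        PAGE.format(title=title, description=description, body=BODY)
    )
```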

After the pages were launched, indexed, and ranking for the fake keyword, I noted which page was ranking in the #4 position out of the seven pages for the fake keyword. After about a week, the rankings usually settle down and a specific page typically ranks consistently in the 4th position. Once this occurred, I edited the page ranking 4th to change all the “John Q. Smith” references to “Donald J. Trump” — and then I waited.
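
As a rough illustration of that monitor-and-swap step, here is a sketch. The rank lookup is a hypothetical stand-in (in practice it would come from a rank-tracking tool or a manual check, since Google’s result pages cannot be reliably scraped), and the name mapping is simply my guess at how the substitutions might look.

```python
# Illustrative sketch of the monitor-and-swap step. get_ranked_urls() is a
# hypothetical placeholder for whatever rank-tracking tool or manual check
# supplies the ordered results for the fake keyword; it is not a real API.
from pathlib import Path

FAKE_KEYWORD = "zyxblorptic review"

# Assumed mapping from control-name variants to variable-name variants.
# Longer phrases come first so they are replaced before their substrings.
NAME_SWAPS = {
    "President John Q. Smith": "President Donald J. Trump",
    "John Q. Smith": "Donald J. Trump",
    "President Smith": "President Trump",
    "Mr. Smith": "Mr. Trump",
    "John Smith": "Donald Trump",
}

def get_ranked_urls(keyword: str) -> list[str]:
    """Hypothetical helper: return the test-page URLs in their current
    ranking order for `keyword` (e.g. exported from a rank tracker)."""
    raise NotImplementedError("plug in your own rank-tracking data source")

def swap_names(html_path: Path) -> None:
    """Replace every control-name reference with the variable name."""
    html = html_path.read_text()
    for old, new in NAME_SWAPS.items():
        html = html.replace(old, new)
    html_path.write_text(html)

ranked = get_ranked_urls(FAKE_KEYWORD)
fourth_place = ranked[3]                      # position #4 is the test subject
page_file = Path("test-pages") / fourth_place.rsplit("/", 1)[-1]
swap_names(page_file)                         # then wait and re-check rankings
```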

If Google results are biased against the President, then the page should drop in the rankings.

Within several days, the page went to the #1 position for the fake keyword. Did this mean Google is biased towards President Trump? I found this equally unlikely, so I created two more variations of this test to confirm.

For each of these variations, I followed the same procedure. Within each set, the pages had the same content except for the title and meta description. Each set also had its own fake keyword, and I used “John Q. Smith” as the control name placeholder. For these two sets, I decided to test with two well-known Republican personalities: Rush Limbaugh and Sean Hannity.

Once all the pages were indexed, I waited for the 4th-position ranking in each set to settle, which took about a week. I then updated those 4th-ranked pages, swapping in each set’s variable name. Both the Rush Limbaugh and Sean Hannity variable pages moved to #1.

Did I just uncover a right-wing conspiracy buried within Google’s algorithm? While I found this even more unlikely than a vast left-wing conspiracy, I decided to create another two test variations.

I set them up just like the last set, but this time I used the names Barack Obama and Hillary Clinton as my variables. Just like before, after indexing, updating, and waiting, both the updated variable pages jumped from position #4 to position #1 for their respective fake keywords.

The interesting thing about testing is that I often learn unexpected things about the search engine. In this case, Google favored my updated variable pages, which to me indicates that it rewards “freshness,” “uniqueness,” or both. Any real person would agree that these test pages could hardly be described as fresh or unique, but to a computer, they are fresher and more unique than the control pages. I have seen this in past tests and actually expected this result, but decided to run them anyway to establish a baseline.

So, test set #1 did not expose a bias against Donald Trump, or against any of my other test subjects for that matter. In the broader sense, Google does not appear to be automatically demoting content about our President.

The next argument could be that Google is biased against, or in favor of, specific people; for example, that it would rank content about former President Barack Obama over content about current President Donald Trump. That gave me my next test.

I set up the second test very similarly to the previous test set: seven pages with identical content except for their titles and meta descriptions, all targeting a new fake keyword. This time, instead of a fake placeholder control name, I used Barack Obama as my control.

After the pages were indexed and the rankings settled, I took the page ranking in the 4th position and replaced Mr. Obama’s name with Mr. Trump’s name. After about a week, the page with Donald Trump’s name ranked #1. I then performed the exact same test, but in reverse: Donald Trump’s name was the control, and Barack Obama’s name was placed on the page ranking in the 4th position. Again, the updated variable page, with Mr. Obama’s name, jumped to the 1st position for the fake keyword.

I wanted to be extra clear, so I decided to run two more tests, which I called #2C and #2D to help keep everything straight. These ran exactly like the tests already described, except that instead of using former President Barack Obama’s name first as the control and then as the variable, I used Hillary Clinton’s name.

Once again, the page with the variable name ultimately ranked 1st for the fake keyword. The Donald Trump variable page beat the Hillary Clinton control pages, and the Hillary Clinton variable page beat the Donald Trump control pages. I was definitely seeing a pattern here.

After running nine separate tests (five using test #1 and four using test #2), the freshest and most unique page always ranked in the first position, regardless of whose name I used.

The Findings

What, if anything, have we learned?

First, the results seem to confirm the validity of Google’s never-ending counsel that we build the best content possible. The base algorithm appears to be aimed at ranking the freshest and most unique content available.

I spent six weeks trying to determine if Google is biased, and I learned that the adage “content is king” seems to be true. That is what I like about testing: sometimes I learn something profound, and sometimes I just reinforce the basics.

Second, I would suggest that the results demonstrate Google’s algorithm is likely not overtly biased against any person. While I will agree that many Google employees clearly have political preferences, I am happy to say that, so far, I see no indication of a political bias programmed into the algorithm. It could be biased in a more subtle way, which is why I am running further tests.

I am currently running tests on sentiment: the control pages contain either negative or positive sentiment, and the variable page contains the opposite. I will test whether positive sentiment towards President Trump helps or hurts a page in the rankings. Regardless of the results, I will run the same test against Obama, Clinton, or both as well.
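
As a side note, before publishing pages like these it could be worth sanity-checking that the positive and negative copy variants actually score the way you intend. Here is a minimal sketch using NLTK’s VADER sentiment analyzer; the sample copy is my own invention, not text from the actual test pages.

```python
# Sanity-check sketch: score two copy variants with NLTK's VADER analyzer.
# The sample sentences are invented placeholders, not the real test copy.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
analyzer = SentimentIntensityAnalyzer()

variants = {
    "positive": "President Smith delivered a brilliant, inspiring speech today.",
    "negative": "President Smith delivered a dull, disappointing speech today.",
}

for label, copy in variants.items():
    # compound ranges from -1 (most negative) to +1 (most positive)
    score = analyzer.polarity_scores(copy)["compound"]
    print(f"{label} variant compound score: {score:+.3f}")
```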

I suspect the results will be consistent with the tests I’ve already run — but you never know for certain!

Charles Taylor

Charles has been actively involved in online marketing since 2000. For the past 15 years, he's focused on SEO in a number of B2B and B2C verticals – legal services, eCommerce, information marketing, and affiliate marketing. He is currently the SEO Manager for Verizon's Fios division. Charles is always looking for new ways to help new and established companies solve their SEO challenges.