Google confirmation bias

If modern politics seems partisan and opinionated, blame the Internet.

Human beings are psychologically programmed to make some very basic mistakes in how they process information. These mistakes are known as “cognitive biases”.

Ivo Vegter, ITWeb contributor

Some forms of cognitive bias are beneficial because they adapt our thinking in ways that enable us to take shortcuts and reach conclusions quickly. We have evolved so-called “heuristics” that let us exploit rules of thumb, intuitive or educated guesses, or what is vaguely known as common sense.

Other forms of cognitive bias lead to mistaken conclusions, however. One such bias is what behavioural economists and social scientists call “confirmation bias”.

This occurs when we unconsciously seek out information that supports our preconceived ideas, and avoid information that would undermine them. We may attach more weight to a fact that confirms our beliefs, and less to facts that call them into question, even though we have no objective grounds for doing so.

For example, if you believe that minibus taxis are a major contributor to South Africa's road death toll, you are likely to notice news reports of taxi accidents. Each such report will confirm your preconceived belief that they are a menace. The fact is, however, that minibus taxis are the safest form of road transport in the country. More than a dozen people die in passenger cars for every taxi-related death. Adjusted for how many people take cars and taxis respectively, and rounding conservatively to compensate for uncertainties in the data, a rational analysis shows that commuting by taxi is at least three times safer than using an ordinary passenger car.
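The adjustment described above is a simple per-trip rate calculation. The sketch below uses hypothetical placeholder figures, chosen only to match the ratios the column cites (roughly a dozen car deaths per taxi death, and taxis about three times safer once ridership is taken into account); they are not the actual South African road-safety statistics.

```python
# Illustrative sketch of the ridership adjustment. All counts below are
# hypothetical placeholders chosen to match the ratios in the text,
# not real South African statistics.

def deaths_per_million_trips(deaths, trips):
    """Fatalities normalised by how many trips each transport mode carries."""
    return deaths / trips * 1_000_000

# Raw counts: roughly a dozen car deaths for every taxi-related death...
car_deaths, taxi_deaths = 1_200, 100
# ...but the modes carry very different numbers of trips.
car_trips, taxi_trips = 400_000_000, 100_000_000

car_rate = deaths_per_million_trips(car_deaths, car_trips)    # 3.0
taxi_rate = deaths_per_million_trips(taxi_deaths, taxi_trips)  # 1.0

print(f"cars:  {car_rate:.1f} deaths per million trips")
print(f"taxis: {taxi_rate:.1f} deaths per million trips")
print(f"per trip, taxis are {car_rate / taxi_rate:.0f}x safer")
```

The point of the exercise is that the raw death counts and the per-trip rates point in opposite directions, which is exactly the gap confirmation bias exploits.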

A similar phenomenon occurs with almost any view we hold. If you think Microsoft software is insecure, you'll notice reports of viruses and bugs, while downplaying reports of vulnerabilities in your preferred brand of operating system. If you think bandwidth prices are too high, you'll likely gloss over reports of price decreases, and emphasise evidence that they're cheaper elsewhere. Those who think the rich are getting richer and the poor are getting poorer will often be blind to statistical evidence that the poor are getting richer, and continue to cite income inequality as a socio-economic problem.

We used to feed our prejudices merely by which newspapers and magazines we chose to read. Some preferred their analysis leaning left, while others wouldn't be caught dead reading anything other than hard-right opinion.

The vast wealth of information at our fingertips today, via the Internet, ought to make it easier to defeat the confirmation biases that cause us to reach false conclusions. We're no longer limited to just what we read in either the Mail & Guardian or the Citizen (and seldom both). But ironically, the Internet has only made matters worse.

Not because information to correct our misconceptions isn't out there. Not even because we tend to suffer from confirmation bias that causes us to be selective about what we notice. But because we deliberately set up our information sources to feed into our biases.

Consider Google Search. It advertises as a useful feature the ability to remember your Web history, and present you with personalised search results. It knows, for example, when you search for “plane”, whether you're likely to mean an aircraft, a workshop tool, or a mathematical construct, and it will present you with search results appropriate to your interests. The unintended consequence, however, is that if you search for a topic of some political, religious, or economic controversy, Google will likely tell you only what you want to hear. It reinforces your confirmation bias before you're even able to make the cognitive error yourself.
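The mechanism is easy to sketch. The toy ranker below is not Google's actual algorithm, just a minimal illustration of the idea: results whose topics overlap with your click history get boosted, so on a contested query the agreeable result floats to the top.

```python
# Toy sketch of history-based personalisation (NOT Google's real
# algorithm): results matching topics the user has clicked before
# are ranked higher, reinforcing whatever the user already believes.

from collections import Counter

def personalised_ranking(results, history):
    """Order results by how strongly their topics overlap with past clicks."""
    topic_weight = Counter(history)  # missing topics count as zero
    return sorted(
        results,
        key=lambda r: sum(topic_weight[t] for t in r["topics"]),
        reverse=True,
    )

results = [
    {"title": "Why policy X works", "topics": ["pro-x"]},
    {"title": "Why policy X fails", "topics": ["anti-x"]},
]

# A user who has only ever clicked "pro-x" pages...
history = ["pro-x", "pro-x", "pro-x"]
ranked = personalised_ranking(results, history)
print(ranked[0]["title"])  # the agreeable result comes first
```

With an empty history the two results tie and the order is arbitrary; the bias only appears once the system has something to "personalise" on.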

Or consider your use of newsreaders or social networks. Chances are you're more likely to read writers with whom you agree than you are to follow those who get up your nose. By filtering your news through the lenses of those whose opinions you respect, you surrender yourself to automated confirmation bias.

As a journalist who researches a wide array of topics in order to write opinion pieces on them, I have turned off Google's “Web history” feature. Not because I'm afraid someone might discover my disturbing tastes or dubious associations, or because personalisation is never useful. I did so because if Google “personalises” my search results, I'll only get information that supports my views and none that opposes them. Likewise, I make a point of reading at least some social network feeds from people or organisations with whom I disagree.

It is hard enough to get a fair and balanced view of the world without having your research tools automatically pander to your biases without your knowledge or consent.

If partisan political views seem ever-more entrenched, while information that could resolve our differences has never been easier to find, it is because we've created automatic filters that magnify our biases.

Personalisation perpetuates polarisation. Or, put more bluntly, Google makes us stupid.
