Confirmation bias is, in a nutshell, the tendency to seek out or pay attention only to information that matches your existing beliefs. The actual content of the data you’re reading becomes somewhat irrelevant: you won’t be interested in it if it doesn’t match the opinions you already hold about the subject.
It affects us all to some extent. We each have a set of beliefs we’re happy with, and we can find it unsettling to have those beliefs challenged. Sometimes we completely disregard a logical and well-formed argument if it doesn’t match the version of the world we keep in our heads. The ego often trumps reality.
It is particularly common in politics. If you listed pairs of political policies and asked people to choose the one they preferred from each pair, you might get some considered selections. But label one column Labour and the other Tory, and the selections would be far less considered. We reinforce such biases with the newspapers we read: Tories read Tory newspapers and Labour supporters read Labour newspapers.
Politicians themselves are some of the worst offenders. They frequently oppose an opinion or idea simply because it’s coming from the other party but, if you think about it, the world isn’t really like that. You would expect the opposing party to come up with good ideas from time to time, even if just by pure luck. Yet you rarely hear one party gushing over another party’s idea, complimenting them on it in Parliament and saying they’ll include it in their own manifesto.
All good reasons to dislike party politics, perhaps. It’s a breeding ground for sectarianism.
There are ways to avoid confirmation bias. The medical profession uses double-blind studies, where neither the researcher nor the patient knows who received the real treatment and who received the placebo. That’s one way to avoid bias, but the equivalent in politics would be for the public to get a list of policies without being told which party they’re from, and I’m not sure that would be practical in our political environment.
One way we can avoid the faulty thinking that leads to confirmation bias is by understanding how we form our opinions. The Ladder of Inference might be useful here. By understanding where decision making goes wrong we can try to avoid its pitfalls in the future.
Another way to improve our decision making might be to use Bayes' Theorem. This takes our biases into account at the outset (called priors) and updates them based on the data we pick up from then onwards, hopefully guiding us to a more quantified decision.
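As a minimal sketch of how that update works (the numbers here are invented for illustration, not taken from any real study):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a belief after seeing one
    piece of evidence, using Bayes' Theorem:
    P(H|E) = P(E|H) * P(H) / P(E)."""
    numerator = p_evidence_if_true * prior
    # P(E) = P(E|H)P(H) + P(E|not H)P(not H)
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Suppose your prior belief that policy X works is only 20% (your bias),
# and a favourable study would appear 70% of the time if X really works
# but 20% of the time even if it doesn't. Seeing that study should move you:
posterior = bayes_update(prior=0.2,
                         p_evidence_if_true=0.7,
                         p_evidence_if_false=0.2)
print(round(posterior, 2))  # roughly 0.47 -- still unsure, but less biased
```

The point isn't the particular numbers; it's that the evidence shifts the belief by a calculable amount rather than being accepted or dismissed wholesale.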
In reality, though, applying logic and some process to our decision making is a lot of effort, and I suspect we rather like our opinionated views. Other animals have been shown to exhibit cognitive bias, and we are, after all, just well-dressed apes. Maybe, though, if we're aware that much of our decision making is based on preconceived biases, we can make an effort to read more stuff that challenges them.
I won't hold my breath, though.
- What Is Confirmation Bias? — Psychology Today.
- How Confirmation Bias Works — Verywell Mind.
- Introduction to Bayesian Decision Theory — Rayhaan Rasheed via Medium.
- Use Bayes’ Rule to uncover (and clarify) your hidden Beliefs in everyday Life — Timo Böhm via Medium.