I discussed confirmation bias in an earlier post here. In this post, I hope to bring astrophysicist Carl Sagan and comedian Stephen Colbert together, and end with a Zen story.
Wikipedia defines confirmation bias as “the tendency to search for, interpret, favor, and recall information in a way that confirms one’s beliefs or hypotheses while giving disproportionately less attention to information that contradicts it.”
Confirmation bias can put the brakes on your scientific thinking, and avoiding your biases is a daily struggle.
The Colbert Report Edition:
I recently came across a study by LaMarre, Landreville and Beam from Ohio State University. In this study the authors investigated the biased processing of political satire in the famous TV show “The Colbert Report.” For those unfamiliar with it, “The Colbert Report” was a political satire show hosted by Stephen Colbert. Colbert referred to his fictional on-air persona as a “well-intentioned, poorly informed, high-status idiot” — a caricature of televised political pundits.
The researchers examined how viewers processed the show’s satire and how political ideology shaped their perceptions of Stephen Colbert. They described his style of comedy as “ambiguous deadpan satire.” The study revealed the following:
- There was no significant difference between conservatives and liberals in how funny they found Stephen Colbert.
- Conservatives reported that Colbert only pretended to be joking and genuinely meant what he said, supporting their conservative ideology. Liberals, on the other hand, reported that Colbert was using satire and was not serious, supporting their liberal ideology.
In other words, liberals and conservatives with extreme viewpoints watched the exact same show and came away with exactly opposite opinions. This is a classic case of confirmation bias!
Carl Sagan and the Fine Art of Baloney Detection:
Carl Sagan was a famous American astrophysicist and a great scientific thinker. In his book The Demon-Haunted World: Science as a Candle in the Dark, Sagan gives us a thinking toolkit to assist us in detecting baloney, as he puts it. He describes it as a means to construct, and to understand, a reasoned argument and — especially important — to recognize a fallacious or fraudulent argument. The tools are as follows:
- Wherever possible there must be independent confirmation of the “facts.”
- Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
- Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
- Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
- Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
- If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
- If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
- Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
- Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable, are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.
Surprisingly, the list above is also applicable to detecting and reducing confirmation bias.
A cup of Tea – a Zen story:
There once lived a great Zen master named Nan-in. The reputation of his wisdom spread, and a university professor decided to visit Nan-in to inquire about Zen.
The professor was welcomed into Nan-in’s room. Nan-in served the professor tea.
The professor’s cup was soon full, yet Nan-in kept on pouring, causing the cup to overflow. Nan-in still kept on pouring.
“Master, please stop,” the professor said. “The cup is full. There is no room for more tea.”
“Like this cup,” Nan-in replied, “your mind is full of your opinions and biases. There is no room for Zen unless you first empty it.”
I will finish with a great piece of wisdom I heard on Quora. Unfortunately, I do not know the source.
“My opinions are not me. My opinions are just pieces of data that I carry in a box with me. I can and should change them based on the information available. If I marry myself to my opinions, I will cling to them regardless of what the information says. If I want to be right, I need to be prepared to change my mind.”
Always keep on learning…
Photo credit – Paul H. Byerly