Friday, October 19, 2012

"I knew it!" Confirmation Bias and Explanation

Source of all evil or defender of all freedom? How the same
event can seemingly justify directly opposing beliefs.

“Confirmation bias” refers to the well-known human foible of favouring our existing beliefs and commitments, even in the face of conflicting evidence. For instance, if you believe that some person – Annie, say – is a shifty person, you will have a tendency to hold to that belief over time, and even come to believe it more strongly. Psychology experiments suggest that confirmation bias works in a variety of ways – including biasing the search for evidence (we search for data that supports our belief, rather than data refuting it), the interpretation of evidence (we look for weaknesses and ambiguities in evidence that questions our belief) and the memory of evidence we have previously encountered (we have faster and more thorough recall of confirming rather than disconfirming evidence). Through these three mechanisms, human beings show a decided tendency to cement their initial beliefs rather than revise them. In some cases confirmation bias can be very powerful. If people are exposed to evidence that Annie is shifty, say, and then shown beyond all dispute that this initial evidence was fabricated, they will still tend to harbour suspicions about Annie’s character that arose on the basis of that evidence – even though they will explicitly acknowledge and believe that the evidence was false!

Of course, confirmation bias is not insurmountable. People can and do decide they were wrong about something, and they can recognize and choose to use tests and lines of enquiry that will expose their mistaken beliefs.

Confirmation bias and philosophy

Does confirmation bias affect philosophers too? There are good reasons to believe it does – consider the old saying that “being a philosopher means never having to admit you’re wrong”. The worry expressed here is that philosophers will use the considerable intellectual tools at their disposal to critique and interrogate opposing theories and evidence, and to search out subtle cases of confirming evidence. In so doing they will corroborate their initial position, rather than using those intellectual tools to honestly enquire into it. And certainly wholesale changes of philosophical theory by established philosophers tend to be pretty rare. To be sure, philosophers develop their positions over time, revising and responding to new evidence and argument, but direct moves into the opposing camp don’t happen a lot, in my experience at least. (And I have no particularly special virtues in this regard either, of course.)

And it is probably fair to say such biases arise even more strongly in the emotionally charged milieu of political philosophy. Those who think that capitalism is a pretty good idea sometimes seem able to find confirming evidence for this belief every day and in every way. And the same is true for those who think capitalism is the fundamental source of every misery in the world: no possible horror can beset humanity without an explanation leading back to capitalism.

Explanations of confirmation bias

So why do we all do it? Well, lots of different reasons have been put forward for this human tendency to cement our beliefs over time. For instance, rejecting a belief is cognitively hard work. The type of thinking that would overturn the belief requires effort, and the re-thinking that follows from revising it takes still more – do other beliefs now have to shift because that first one has been rejected? And since human beings are largely prone to avoiding effort, we are motivated to avoid these situations of mental heavy-lifting. Easier to stick with what we know.

Also, the more we understand our world, the better we are doing, and the more secure we feel. Beliefs underwrite our actions and our projects in the world. If our beliefs can be relied upon, then that enhances our ability to predict future events and attain our goals. Finding out a prized belief is wrong threatens that happy security.

And there is a social factor. The more we change our mind and revise what we have said, the less others can feel confident in relying on us as a valuable source of information and insight. Since most of us like our views to be taken seriously, a habit of admitting defeat carries social costs.

There’s a lot to be said for these sorts of explanations of confirmation bias – and the countless others out there in the psychology literature. In all likelihood, confirmation bias is over-determined – there are lots of reasons we do it.

However, I want to reflect on the possibility that confirmation bias arises from largely rational, sensible ways of thinking – in particular the search for explanations for events.

Confirmation bias and seeking explanations

Imagine something strange happens in the world – something you can’t explain. But it is (just suppose) important for you to be able to understand it. So you seek an explanation for it.

What does that involve?

Well, one of the main things it will involve will be aligning this new event with your current beliefs. If you can work out how this new thing happened, given what you already believe, then you will have explained that event. If you can work out how the facts as you currently understand them would have caused (or at least allowed the possibility of) this new event, then you will have explained it. That, pretty much, is just what it means to have an explanation.

So when you start searching for information to explain the event, you search for what we might call linking facts: facts that would allow you to move from your beliefs as they stand to the occurrence of the event.

Current beliefs + Linking facts = Explanation of event (or phenomena)

This, I hope, is pretty straightforward. If we want to understand something, there’s no point aligning it with other people’s beliefs (how would that help?), and no point trying to explain it from first principles all the way up (couple of years to spare?). The new phenomenon is understood and explained only when it makes sense, given what we already accept. This doesn’t mean that the linking facts can’t replace or force revision of existing beliefs, but it does mean the existing beliefs fundamentally frame what counts as an explanation.
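To make the schema concrete, here is a toy sketch of it in Python. Every name, belief and “rule” below is invented purely for illustration – this is a sketch of the idea, not a serious model of inference. An event counts as explained just when the agent’s current beliefs, together with some set of linking facts, entail it:

    # A toy model of the schema: current beliefs + linking facts = explanation.
    # All names, beliefs and "rules" here are invented for illustration.

    # A crude stand-in for entailment: a lookup table from premise sets to
    # the events those premises would account for.
    RULES = {
        frozenset({"the US drives world events",
                   "CIA agents operate near Syria"}): "the Syrian crisis",
    }

    def entails(premises: set[str], event: str) -> bool:
        return any(rule <= premises and outcome == event
                   for rule, outcome in RULES.items())

    def explains(beliefs: set[str], linking_facts: set[str], event: str) -> bool:
        """An event is explained when beliefs plus linking facts entail it."""
        return entails(beliefs | linking_facts, event)

    beliefs = {"the US drives world events"}
    print(explains(beliefs, {"CIA agents operate near Syria"},
                   "the Syrian crisis"))  # True: the linking fact bridges
                                          # the standing belief and the event.

The point of the sketch is just that the beliefs the agent already holds do half the explanatory work: change the standing beliefs, and a different set of linking facts becomes the set worth looking for.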

In fact, this search for linking facts has at least two results.

First, the type of linking facts you are looking for will vary depending on what your current beliefs are. If you believe that the United States is ultimately the root of all international problems, for example, then you will search out the type of linking facts that will connect the current phenomenon – the situation in Syria in 2012, say – to the US. You will look for the involvement of the CIA, the pressure the US exerts on the global media, its historical influence on and action in the Middle East, its current oil interests in that region, and so on and on. On the other hand, if you believe that most of the world’s problems arise from extreme ideologies and fundamentalist religious beliefs, then you will search for very different sorts of linking facts.

Second, the search for linking facts will determine when your investigation stops. Once you have located the required linking facts, then the event is understood and explained. You can stop searching. So once you have found – to return to the example – that the US has CIA agents at work on Syria’s border with Turkey, that it has a history of animosity with the Syrian regime, and that Syria is an ally of Russia, then you have explained the Syrian crisis and the way it is presented in the mainstream media.

Job done. Move on.

And, of course, if you had started out with concerns about religious extremism, then in all likelihood a different set of linking facts would have been discovered, and the search stopped at that point. In other words, you will not continue to search in such a way that you might (a) find subsequent facts that disprove the existence of your linking facts or undermine their capacity to explain the event, or (b) find subsequent facts that would better account for the event, using an entirely different explanation.
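This stopping behaviour can be made vivid with a small simulation – again in Python, and again with all the evidence items invented for illustration. Two agents scan the same stream of evidence; each stops as soon as it finds a fact that links its own prior belief to the event, and so neither ever reaches the facts that would have supported the rival explanation (or undercut its own):

    # Two agents search the same evidence stream for linking facts, each
    # stopping as soon as its own prior belief can explain the event.
    # The evidence items and tags are invented purely for illustration.

    EVIDENCE = [
        ("CIA agents reported near the Turkish border", "us_involvement"),
        ("foreign fighters espousing extremist ideology arriving", "extremism"),
        ("US has a history of animosity with the Syrian regime", "us_involvement"),
        ("sectarian militias forming along religious lines", "extremism"),
    ]

    def search_for_explanation(prior: str) -> list[str]:
        """Scan evidence in order; halt once the prior belief is 'confirmed'.

        The agent collects only facts tagged as relevant to its prior, and
        stops at the first such fact -- so it never examines later items
        that might support a rival explanation, or undercut its own.
        """
        found = []
        for fact, tag in EVIDENCE:
            if tag == prior:
                found.append(fact)
                break  # explanation in hand: job done, move on
        return found

    print(search_for_explanation("us_involvement"))
    print(search_for_explanation("extremism"))
    # Same evidence, different priors, different 'linking facts' found --
    # and each agent leaves with its starting belief apparently confirmed.

Neither agent does anything wilfully irrational here – each just searches until an explanation is in hand – yet both walk away with their starting belief seemingly confirmed.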

Now this search for explanation is not in any sense irrational. But it can clearly contribute to confirmation bias. As well as constraining the nature and end-point of the search, it cements the initial belief even further.

Why?

Because now that initial belief (about the role of the US in world affairs, say) helps explain this new event. The fact that it can explain this means you now have one more reason for believing it. If someone else later challenges this belief of yours, you are entitled to think: “But wait, clearly the US is playing this role, because I found evidence of its presence in the Syrian crisis.” You did not set out to test this belief, but you nevertheless ultimately collected evidence that helped justify your continued belief in it. In this way, consideration of the same event by two people with different starting beliefs, even with access to the same information, will for each of them contribute to the justification of their initial beliefs.

And that’s a problem, because it means that confirmation bias arises from what are otherwise quite sensible and effective methods for understanding and explaining events.
