Bias and reason

From an article on confirmation bias:

In 2009, a study at Ohio State showed that we will spend 36 percent more time reading an essay if it aligns with our opinions.

But a preference to spend more time reading things you agree with is either not a bias at all, or else some biases are reasonable. It is almost tautological that, ceteris paribus, you spend more time with what you prefer, and so if bias means nothing more than preference, then bias is both desirable and unavoidable. But presumably biases are irrational or distorting preferences, i.e. preferences that occlude the truth.

Confirmation bias really happens, and is easy enough to prove with experiments on even small groups of people, like Peter Wason's 2-4-6 task. But while it's true that we spend more time looking for confirming evidence than for disconfirming evidence, it does not follow that a person with perfect objectivity would spend an equal amount of time looking for both, and just what the appropriate time-ratio would be is probably a ridiculous question to raise.
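Wason's task is simple enough to simulate. Here is a minimal sketch in Python: the hidden rule is the one Wason actually used (any ascending triple), but the two participants' testing strategies are my own illustrative assumptions about confirming versus disconfirming behavior, not his protocol.

```python
# A minimal sketch of Wason's 2-4-6 task. The hidden rule matches
# Wason's (any strictly ascending triple); the two testing strategies
# below are illustrative assumptions, not his actual procedure.

def hidden_rule(triple):
    """True if the triple is strictly ascending (the experimenter's rule)."""
    a, b, c = triple
    return a < b < c

# A "confirming" participant, shown (2, 4, 6), hypothesizes
# "numbers increasing by 2" and tests only triples that fit that guess.
confirming_tests = [(4, 6, 8), (10, 12, 14), (1, 3, 5)]

# A "disconfirming" participant also tries triples that the narrower
# hypothesis would rule out but a broader rule might still allow.
disconfirming_tests = [(4, 6, 8), (1, 2, 10), (3, 2, 1), (5, 5, 5)]

for name, tests in [("confirming", confirming_tests),
                    ("disconfirming", disconfirming_tests)]:
    print(name)
    for triple in tests:
        print(f"  {triple}: {'yes' if hidden_rule(triple) else 'no'}")
```

Every confirming test comes back "yes," so that participant never learns the rule is broader than "increase by 2"; it is the triple (1, 2, 10), which also comes back "yes," that exposes the narrower hypothesis as wrong.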

But there is a deeper problem: calling something confirmation bias is just the first move in trying to work out what a reasonable approach to evidence should look like. True, in Wason's experiment almost everybody fails to find the rule Wason wants them to find, but it doesn't follow that a more reasonable person would be just as zealous to disconfirm an initial idea as to confirm it. For all I know, the sort of person who does well in the Wason experiment might make a worthless scholar; and my suspicion is that something like this might just be the case. It may not be reasonable to be just as open to doubt as to confirmation in our initial experiences with things, still less that we should prefer doubt to confirmation (Taleb, for example, argues that we should, and Descartes does the same with his methodological doubt). Perhaps doubt can only be effective on a basis that can only be established by a preference for confirmation.

Cognitive research isn't my field, but I've tried to read widely in the popular literature. One concern I have is that the research has a short-term bias of its own: it measures momentary responses to evidence, but can't yet raise the question of what a reasonable person would do in the face of evidence over the long term. This is tied to a deeper question of what exactly counts as a reasonable response, and getting some short-term answers systematically wrong might well be compatible with long-term reasonableness.

1 Comment

  1. skholiast said,

    December 11, 2014 at 8:57 pm

    Responding to clinical research with personal anecdotal evidence is doubtless silly, by some definitions. But my impression is that I tend to read far more material with which I disagree than otherwise. Very often when I sense that I agree with a piece of writing, I put it aside; I already know what I'll find there. Sometimes reading something with which I disagree changes my mind, sometimes it just makes me think, even if only about how it's wrong. But I get a lot more traction out of work I disagree with.

    However, that said, I tend to re-read work I agree with far more often.

