From an article on confirmation bias:
In 2009, a study at Ohio State showed that we will spend 36 percent more time reading an essay if it aligns with our opinions.
But a preference for spending more time reading things you agree with is either not a bias at all, or proof that some biases are reasonable. It is almost tautological to say that, ceteris paribus, you spend more time with what you prefer, and so if bias means nothing more than preference, then bias is both desirable and unavoidable. But presumably biases are irrational or distorting preferences, i.e. preferences that occlude the truth.
Confirmation bias really happens, and it is easy enough to demonstrate with experiments on even small groups of people, like Peter Wason's 2-4-6 experiment. But while it is true that we spend more time looking for confirming evidence than for disconfirming evidence, it does not follow that a person with perfect objectivity would spend an equal amount of time looking for both, and just what the appropriate ratio would be is probably a ridiculous question to raise.
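The logic of Wason's task can be sketched in a few lines. The function names and the particular test triples below are my own illustration, not Wason's protocol; the point they show is that a test chosen to confirm the typical hypothesis can never distinguish it from the hidden rule, while a single disconfirming test does so immediately.

```python
def hidden_rule(a, b, c):
    # Wason's actual rule: the three numbers simply ascend.
    return a < b < c

def typical_hypothesis(a, b, c):
    # The narrower rule most participants form after seeing (2, 4, 6):
    # each number increases by exactly 2.
    return b - a == 2 and c - b == 2

# Confirming tests: triples chosen to fit the participant's hypothesis.
confirming = [(4, 6, 8), (10, 12, 14), (1, 3, 5)]

# Disconfirming tests: triples the hypothesis rejects but the hidden
# rule still accepts.
disconfirming = [(1, 2, 3), (2, 4, 9)]

# Every confirming test passes both rules, so it can never tell them apart.
for t in confirming:
    assert hidden_rule(*t) and typical_hypothesis(*t)

# A disconfirming test separates the two rules at once.
for t in disconfirming:
    assert hidden_rule(*t) and not typical_hypothesis(*t)
```

Participants who offer only confirming triples get "yes" every time and so announce the wrong rule with high confidence.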
But there is a deeper problem: calling something confirmation bias is just the first move in trying to work out what a reasonable approach to evidence should look like. True, in Wason's experiment almost everybody fails to find the rule Wason wants them to find, but it doesn't follow that a more reasonable person would be just as zealous to disconfirm an initial idea as to confirm it. For all I know, the sort of person who does well in the Wason experiment might make a worthless scholar; and my suspicion is that something like this might just be the case. It may not be reasonable to be just as open to doubt as to confirmation in our initial experiences with things, still less to prefer doubt to confirmation (Taleb, for example, argues that we should, and Descartes does the same with his methodological doubt). Perhaps doubt can only be effective on a basis that can only be established by a preference for confirmation.
Cognitive research isn't my field, but I've tried to read widely in the popular literature. One concern I have is with a short-term focus that can't yet raise the question of what, over the long term, a reasonable person would do in the face of evidence. This is tied to a deeper question of what exactly counts as a reasonable response, and getting some short-term answers systematically wrong might well be compatible with being reasonable in that larger sense.