The Confirmation Bias
Previously in one of my posts I referenced a task designed by Peter Wason in the 1960s that showed participants three numbers and asked them to discover the rule those numbers followed. An excellent video on the task can be found here: https://www.youtube.com/watch?v=vKA4w2O61Xo.
Wason’s experiment highlighted a prevalent bias that affects us all: the Confirmation Bias. The Confirmation Bias is the tendency to search for, interpret, favour, and recall information in a way that confirms one's pre-existing beliefs or hypotheses (Plous, 1993). Imagine you are putting together a jigsaw puzzle: you already have in your head what the finished picture should look like, so you jam the pieces together until they match your predetermined image.
According to American psychologist Raymond Nickerson, ‘if one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration’ (1998).
A literature review by Rassin (2008) discusses how our perception is influenced by the Confirmation Bias. For example, the expectation that a certain stimulus is present can lead us to see that stimulus when, in fact, it is not present at all (an illusion). Ambiguous stimuli are also perceived in a hypothesis-congruent way (Risinger et al., 2002). Lastly, the confirmation bias can lead us to concentrate so strongly on confirming evidence that disconfirming and peripheral information is literally overlooked (Deffenbacher, 1980).
The Confirmation Bias has also been found to influence judgement and decision making. A study by Lord, Ross and Lepper (1979) concluded that when people cannot avoid disconfirming evidence, they tend to give it less weight than confirming information. In this study, 24 students in favour of the death penalty and 24 against it were exposed to a number of short research reports. The reports were fabricated for the study so that they were all of equal quality. Some produced results supporting the crime-reducing effect of the death penalty, whereas others produced opposing findings. Participants were instructed to rate the quality of all the reports, and indeed they rated reports whose results were in line with their personal opinion as better than those describing conflicting findings. In addition, by the end of the study participants held their opinions even more strongly than at the beginning. Hence, confrontation with disconfirming evidence was not merely ineffective but counterproductive.
Another manifestation of the confirmation bias is that it is very hard for people to change their opinion, even when faced with clear counter-evidence. Ross, Lepper and Hubbard (1975) asked 60 participants to differentiate between 25 actual suicide notes and 25 fabricated ones. Participants then received false feedback about their accuracy: some were told they had performed well, while others were told they had performed badly. After a short delay, participants were informed that this feedback was false and part of the experimental procedure, and they were asked to verify that they understood this. Nonetheless, their responses to subsequent questions indicated that the feedback still influenced their self-perceived accuracy, as well as their expectations of future performance on similar tasks. In other words, the original feedback could not be undone by subsequent overriding information.
There are many viewpoints on the origin of the confirmation bias. From a cognitive perspective, it may stem from our reliance on heuristics (mental shortcuts), such as the availability heuristic, which makes confirming information easy to retrieve. From a motivational perspective, it may reflect the desire to have pleasant thoughts (confirming our beliefs) rather than unpleasant thoughts (disputing our beliefs). From an evolutionary perspective, it could be the result of weighing the cost of accepting a false hypothesis against that of rejecting a true one (e.g. I can see my partner is selfish, but I see her as altruistic to preserve the relationship).
Looking for falsification of every perception, judgement and decision we make is time- and resource-consuming. Whilst it is our best defence against the confirmation bias, it is simply not practical in all situations. Klayman & Ha (1987) described an efficient, though not always accurate, shortcut they coined the “positive test strategy”, and argued that the best test of a hypothesis is the one expected to produce the most information. This means that either a positive test or an attempted falsification can be useful; it just depends on the circumstances and which will yield the most data.
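The logic of Wason’s task can be sketched in a few lines of code. This is a minimal illustration, not a reproduction of the experiment: the hidden rule (“any ascending sequence”) matches Wason’s original, but the participant hypothesis (“numbers increase by two”) is just a typical guess used here for demonstration. It shows why positive tests alone can never expose an overly narrow hypothesis, while a single falsifying test can.

```python
def hidden_rule(seq):
    # Wason's actual rule: any strictly ascending triple
    return seq[0] < seq[1] < seq[2]

def hypothesis(seq):
    # A typical participant guess: "each number increases by 2"
    return seq[1] - seq[0] == 2 and seq[2] - seq[1] == 2

# Positive test strategy: only propose triples that fit the hypothesis.
positive_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
for seq in positive_tests:
    # Every positive test "confirms" the hypothesis: the rule and the
    # hypothesis agree, so nothing new is learned.
    assert hypothesis(seq) and hidden_rule(seq)

# Falsification attempt: a triple the hypothesis predicts should FAIL.
negative_test = (1, 2, 4)
# The hypothesis rejects it, yet the hidden rule accepts it, proving
# the hypothesis is too narrow.
print(hypothesis(negative_test), hidden_rule(negative_test))  # False True
```

However many confirming triples you feed in, the loop above never distinguishes “increases by two” from “any ascending sequence”; only the triple that was expected to fail reveals the difference.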
In the current information climate, the confirmation bias has never been so prevalent and powerful. As we transition to a personalised technological future where social platforms employ algorithms to target users with deliberate information, objectivity can quickly become eroded. As the amount of confirming evidence becomes overwhelming, we have to consciously take time to look for falsifications, as a “no” is often much more revealing than a “yes”.
Deffenbacher, K. A. (1980). Eyewitness accuracy and confidence: Can we infer anything about their relationship? Law and Human Behavior, 4(4), 243-260.
Klayman, J., & Ha, Y.-W. (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94(2), 211-228. doi:10.1037/0033-295X.94.2.211

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098-2109.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.
Plous, S. (1993). The Psychology of Judgment and Decision Making. p. 233.
Rassin, E. (2008). Individual differences in the susceptibility to confirmation bias. Netherlands Journal of Psychology, 64(2), 87-93. http://dx.doi.org/10.1007/BF03076410
Risinger, D. M., Saks, M. J., Thompson, W. C., & Rosenthal, R. (2002). The Daubert/Kumho implications of observer effects in forensic science: Hidden problems of expectation and suggestion. California Law Review, 90(1), 1-56.
Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32(5), 880-892.