Saturday, January 16, 2016

Bias #1: Confirmation Bias


"Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases."

THE INTRODUCTION

io9:
We love to agree with people who agree with us. It's why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes. We tend to be put off by individuals, groups, and news sources that make us feel uncomfortable or insecure about our views — what the social psychologist Leon Festinger called cognitive dissonance. It's this preferential mode of behavior that leads to the confirmation bias — the often unconscious act of referencing only those perspectives that fuel our pre-existing views, while at the same time ignoring or dismissing opinions — no matter how valid — that threaten our world view. And paradoxically, the internet has only made this tendency even worse.

There is also controversy as to whether some of these biases count as useless or irrational, or whether they result in useful attitudes or behavior. For example, when getting to know others, people tend to ask leading questions that seem biased towards confirming their assumptions about the person. This kind of confirmation bias has been argued to be an example of social skill: a way to establish a connection with the other person.

Wikipedia:
Confirmation bias, also called confirmatory bias or myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities. It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).

Myside bias can cause an inability to effectively and logically evaluate the opposite side of an argument. Studies suggest that myside bias reflects an absence of "active open-mindedness," meaning the active search for reasons why an initial idea may be wrong. Typically, myside bias is operationalized in empirical studies as the quantity of evidence used in support of one's own side compared with the opposite side. People demonstrate sizable myside bias when discussing their opinions on controversial topics. Memory recall and the construction of experiences also undergo revision in relation to current emotional states: emotional memories are reconstructed in light of how one feels at the time of recall.

Illusory correlation may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together. Psychological theories differ in their predictions about selective recall. Schema theory predicts that information matching prior expectations will be more easily stored and recalled than information that does not match. Some alternative approaches say that surprising information stands out and so is memorable. Predictions from both these theories have been confirmed in different experimental contexts, with no theory winning outright.


History

Before psychological research on confirmation bias, the phenomenon had been observed anecdotally throughout history.

The Greek historian Thucydides (c. 460 BC – c. 395 BC) wrote of misguided reason in The Peloponnesian War: "...for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy."

The Italian poet Dante Alighieri (1265–1321) noted it in his famous work, the Divine Comedy, in which St. Thomas Aquinas cautions Dante when they meet in Paradise: "opinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind."

The English philosopher and scientist Francis Bacon (1561–1626), in the Novum Organum, noted that biased assessment of evidence drove "all superstitions, whether in astrology, dreams, omens, divine judgments or the like". He wrote: "The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects[.]"

In his 1897 essay "What Is Art?", Russian novelist Leo Tolstoy wrote: "I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives."


THE EVIDENCE

A series of experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. In certain situations, this tendency can bias people's conclusions. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another explanation is that people show confirmation bias because they are weighing up the costs of being wrong, rather than investigating in a neutral, scientific way.

Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Poor decisions due to these biases have been found in political and organizational contexts, among others:

In physical and mental health

Biased assimilation is a factor in the modern appeal of alternative medicine, whose proponents are swayed by positive anecdotal evidence but treat scientific evidence hyper-critically. Cognitive therapy was developed by Aaron T. Beck in the early 1960s and has become a popular approach. According to Beck, biased information processing is a factor in depression. His approach teaches people to treat evidence impartially, rather than selectively reinforcing negative outlooks. Phobias and hypochondria have also been shown to involve confirmation bias for threatening information.

One factor in the appeal of alleged psychic readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives. By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of cold reading, with which a psychic can deliver a subjectively impressive reading without any prior information about the client.
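A back-of-the-envelope sketch makes the arithmetic of that technique concrete. The 30% per-statement match rate below is an invented assumption for illustration, not a figure from the text; the point is only that the chance of at least one hit climbs quickly with the number of vague statements.

```python
# Toy model of cold reading: each vague statement independently "matches" the
# listener's life with some probability. The 30% figure is an assumption made
# up for illustration, not an empirical value.

def chance_of_at_least_one_hit(p_match: float, n_statements: int) -> float:
    """Probability that at least one of n independent statements fits the listener."""
    return 1 - (1 - p_match) ** n_statements

for n in (1, 5, 10, 20):
    print(f"{n:2d} statements -> {chance_of_at_least_one_hit(0.30, n):.0%} "
          "chance of at least one apparent hit")
```

With twenty statements the listener is all but guaranteed a hit to latch onto, and confirmation bias does the rest by discounting the misses.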

In law and politics

Biased interpretation is not restricted to emotionally significant topics. In an experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements. Nickerson argues that reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries or governments have already committed to. Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The "backfire effect" is a name for the finding that, given evidence against their beliefs, people can reject the evidence and believe even more strongly. The term was coined by Brendan Nyhan and Jason Reifler.

Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position. On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict.

A two-decade study of political pundits by Philip E. Tetlock found that, on the whole, their predictions were not much better than chance. Tetlock divided experts into "foxes" who maintained multiple hypotheses, and "hedgehogs" who were more dogmatic. In general, the hedgehogs were much less accurate. Tetlock blamed their failure on confirmation bias — specifically, their inability to make use of new information that contradicted their existing theories.

In self-image

Social psychologists have identified two tendencies in the way people seek or interpret information about themselves. Self-verification is the drive to reinforce the existing self-image and self-enhancement is the drive to seek positive feedback. Both are served by confirmation biases. In experiments where people are given feedback that conflicts with their self-image, they are less likely to attend to it or remember it than when given self-verifying feedback. They reduce the impact of such information by interpreting it as unreliable. Similar experiments have found a preference for positive feedback, and the people who give it, over negative feedback.

In science

A distinguishing feature of scientific thinking is the search for falsifying as well as confirming evidence. However, many times in the history of science, scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data. In the context of scientific research, confirmation biases can sustain theories or research programs in the face of inadequate or even contradictory evidence. An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called file drawer effect. Illusory correlation is the tendency to see non-existent correlations in a set of data.
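A small numerical sketch (the counts are invented for illustration) shows how an illusory correlation can arise: if attention goes mainly to the salient "both happened" cell of a 2x2 table, a data set with no association at all can feel like strong evidence of one.

```python
# Hypothetical 2x2 contingency counts, invented purely for illustration.
both          = 80   # treated and improved (salient, easily recalled)
treated_only  = 20   # treated, did not improve
improved_only = 40   # not treated, improved
neither       = 10   # not treated, did not improve

rate_treated   = both / (both + treated_only)               # 0.80
rate_untreated = improved_only / (improved_only + neither)  # 0.80

print(f"improvement rate with treatment:    {rate_treated:.0%}")
print(f"improvement rate without treatment: {rate_untreated:.0%}")
# The two rates are identical, so the full table shows no association, yet the
# large "both happened" count is what sticks in memory.
```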

Confirmation bias may thus be especially harmful to objective evaluations regarding nonconforming results, since biased individuals may regard opposing evidence as weak in principle and give little serious thought to revising their beliefs. Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.


THE VERDICT

Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, called heuristics, that they use. For example, people may judge the reliability of evidence by using the availability heuristic — i.e., how readily a particular idea comes to mind. It is also possible that people can only focus on one thought at a time, and so find it difficult to test alternative hypotheses in parallel. Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.
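A convenient illustration of the positive test strategy is a rule-discovery game in the style of Wason's 2-4-6 task (the specific rules below are illustrative assumptions, not taken from the text): a subject whose hypothesis is narrower than the true rule will receive nothing but confirmations for as long as they only propose cases their hypothesis predicts should work.

```python
def true_rule(triple):
    """The experimenter's hidden rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """The subject's narrower guess: numbers increase by exactly 2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Positive tests: only triples the hypothesis says should work.
for t in [(2, 4, 6), (10, 12, 14), (1, 3, 5)]:
    print(t, "-> rule says", true_rule(t))   # always True: pure confirmation

# Only a case the hypothesis expects to FAIL can expose that it is too narrow.
t = (1, 2, 10)
print(t, "-> hypothesis says", my_hypothesis(t), "but rule says", true_rule(t))
```

Every positive test passes, so the narrow hypothesis survives; only the deliberately disconfirming probe reveals the gap.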

Motivational explanations involve an effect of desire on belief, sometimes called "wishful thinking". It is known that people prefer pleasant thoughts over unpleasant ones in a number of ways: this is called the "Pollyanna principle". Applied to arguments or sources of evidence, this could explain why desired conclusions are more likely to be believed true. According to experiments that manipulate the desirability of the conclusion, people demand a high standard of evidence for unpalatable ideas and a low standard for preferred ideas. This effect, known as "disconfirmation bias", has been supported by other experiments.

In other words, people ask, "Can I believe this?" for some suggestions and "Must I believe this?" for others. Although consistency is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information. Social psychologist Ziva Kunda combines the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.
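One way to picture the "Can I believe this?" versus "Must I believe this?" asymmetry is as two different acceptance thresholds applied to the same evidence; the numbers below are invented purely for illustration.

```python
def accept(probability_claim_is_true: float, threshold: float) -> bool:
    """Accept a claim only if the perceived evidence clears the threshold."""
    return probability_claim_is_true >= threshold

evidence = 0.7         # assumed strength of the same body of evidence
CAN_I_BELIEVE = 0.5    # lenient bar applied to welcome conclusions
MUST_I_BELIEVE = 0.95  # strict bar applied to unwelcome conclusions

print("welcome claim accepted:  ", accept(evidence, CAN_I_BELIEVE))   # True
print("unwelcome claim accepted:", accept(evidence, MUST_I_BELIEVE))  # False
```

The same evidence yields opposite verdicts, which is the disconfirmation bias described above in miniature.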

Studies have found individual differences in myside bias. These differences are acquired through learning in a cultural context and are mutable. Research has identified important individual differences in argumentation: deductive reasoning ability, the ability to overcome belief bias, epistemological understanding, and thinking disposition are significant predictors of the quality of the arguments, counterarguments, and rebuttals people generate.

Psychologists Jennifer Lerner and Philip Tetlock distinguish two different kinds of thinking process. Exploratory thought neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while confirmatory thought seeks to justify a specific point of view. Lerner and Tetlock say that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock say that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they don't already know. Because those conditions rarely exist, they argue, most people are using confirmatory thought most of the time.
