
Our own choices generate biases for subsequent decisions

Humans like to think of their judgments as 'rational' and based solely on objective information. Yet we have found that people interpret decision-relevant information in a way that is distorted by their previous judgments. This mechanism can account for many important real-life biases, and it may be a natural consequence of the architecture of the brain.

Credits: Pixabay - CC0
by Bharath Chandra Talluri | PhD student, Anne E. Urai | Postdoctoral Research Fellow, and Tobias H. Donner | Professor

Bharath Chandra Talluri is a PhD student at the Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany. Anne E. Urai is a Postdoctoral Research Fellow at Cold Spring Harbor Laboratory, New York, USA. Tobias H. Donner is a Professor at the University Medical Center Hamburg-Eppendorf, Hamburg, Germany. All three are also authors of the original article.
Edited by Massimo Caine, Founder and Director
Reading time: 4 min | Published on Apr 8, 2019

Human judgment and decision-making are strongly shaped by biases. Intriguingly, some of those biases result from the choices we have made in the past. Having committed to a categorical judgment, we no longer interpret new information neutrally but are biased to confirm our initial judgment. You may have experienced this yourself: after stating in a poll that you will vote for a political candidate, you feel confirmed by the candidate's good policy proposals but ignore the bad ones.

This phenomenon, called confirmation bias, was first described in the 5th century BC. Confirmation bias is pervasive in daily life, affecting human judgment in numerous situations of critical significance: medical diagnostics, job interviews, court cases, scientific hypothesis testing, or political choices. Yet why our brains tend towards confirmation has long remained elusive.

We set out to illuminate this question, starting from recent insights in neuroscience. Forming a decision requires the coordinated action of thousands of neurons distributed across many different regions of the brain. As people commit to a categorical decision ("I will vote for this candidate"), this network of neurons settles into a stable state, called an "attractor". What's more, brain regions involved in decision formation send strong feedback signals to the regions processing the incoming information. We reasoned that this feedback might account for confirmation bias: once the brain's "decision regions" enter an attractor state and an initial decision is made, the feedback will amplify the processing of new information that is consistent with the initial decision while suppressing the processing of information that contradicts it. Intriguingly, this is similar to the way our brains filter information when we selectively process a particular aspect of the sensory input to accomplish our goals: say, focusing on an announcement in a train station amidst the distracting noise in order to catch the details about the train we are waiting for. So we hypothesized that confirmation bias results from the selective filtering of incoming information, just like goal-directed attention.
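
To make the idea of selective filtering concrete, here is a minimal sketch in Python (our own illustration, not code or parameter values from the original study; the function name and gain values are assumptions). It shows how, once a choice has been committed, applying a higher gain to choice-consistent evidence than to inconsistent evidence pulls the integrated estimate towards the chosen category:

def filtered_estimate(evidence, choice, gain_consistent=1.5, gain_inconsistent=0.5):
    """Average evidence samples after a categorical commitment.

    evidence: values relative to the category boundary (positive = 'right',
              negative = 'left' of the reference); choice: +1 or -1.
    The two gain values are illustrative assumptions, not fitted parameters.
    """
    weighted_sum, total_weight = 0.0, 0.0
    for sample in evidence:
        # Feedback amplifies choice-consistent input and attenuates the rest.
        gain = gain_consistent if sample * choice >= 0 else gain_inconsistent
        weighted_sum += gain * sample
        total_weight += gain
    return weighted_sum / total_weight

# After choosing "right of the reference" (choice = +1), a second stream that
# on average points slightly left is reinterpreted as pointing right:
second_stream = [-8, -5, 3, 4, -2]                    # hypothetical directions (degrees)
print(filtered_estimate(second_stream, choice=+1))    # about +0.67, pulled towards the choice
print(sum(second_stream) / len(second_stream))        # -1.6, the neutral average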

We developed a novel behavioural laboratory task that enabled us to test this idea. Volunteers watched a cloud of flickering dots on a computer screen. When a beep played, some of the dots moved in unison, like wind blowing snowflakes in one direction. We then asked the volunteers to categorize the motion of the dots (to the left or to the right of a reference?). This judgment was hard because only a few dots moved coherently. We then showed them a second stream of moving dots, which could move in a slightly different direction. Finally, volunteers estimated, as precisely as possible, the average direction across the two streams.

Indeed, the volunteers selectively filtered the new information (the directions of the second stream of dots) so as to confirm their initial categorical judgment: new motion directions consistent with the initially chosen category (say, left of the reference) contributed strongly to the final estimate of the average direction, whereas directions inconsistent with that category had little influence. The commitment to the initial categorical choice seems to have induced a change in brain state that made people interpret new information in a self-confirmatory way.

In psychology, confirmation bias had previously been studied in more complex decisions, such as those entailed in abstract reasoning or economic choices. It was striking to observe signatures of confirmation bias in decisions about basic sensory stimuli (dots darting across a computer screen like snowflakes) that did not carry any particular meaning for the participants. This suggests that confirmation bias is deeply rooted in the brain's machinery for decision-making.

To test whether our insights generalize to confirmation biases in complex decisions, we asked a different group of volunteers to do another version of the task: rapidly averaging a stream of numbers flashed on the screen. Halfway through the stream, they had to categorize the mean by saying whether it was lower or higher than 50; at the end of the whole stream, they typed in what they thought was the average across both halves. Again, they tended to up-weight numbers consistent with their chosen category (say, greater than 50) and down-weight numbers inconsistent with that choice.
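
As a rough illustration of this behavioural signature, one can simulate many trials of such a numbers-averaging task and compare a biased observer's reported average with the true average. The sketch below is our own illustration, with made-up stimulus statistics and weights rather than the weights estimated in the study:

import random

def simulate_trial(n_numbers=8, boundary=50, w_consistent=1.3, w_inconsistent=0.7):
    """One trial of a numbers-averaging task with choice-consistent overweighting.

    The stimulus statistics and weights are made-up values for illustration only.
    """
    numbers = [random.gauss(boundary, 15) for _ in range(n_numbers)]
    half = n_numbers // 2
    # Halfway through the stream: categorize the mean so far as above or below 50.
    choice_high = sum(numbers[:half]) / half > boundary
    weights = []
    for i, x in enumerate(numbers):
        if i < half:
            weights.append(1.0)              # before the choice: neutral weighting
        elif (x > boundary) == choice_high:
            weights.append(w_consistent)     # consistent with the choice: up-weighted
        else:
            weights.append(w_inconsistent)   # inconsistent with the choice: down-weighted
    reported = sum(w * x for w, x in zip(weights, numbers)) / sum(weights)
    true_mean = sum(numbers) / n_numbers
    sign = 1 if choice_high else -1
    return sign * (reported - true_mean)     # positive = estimate pushed towards the choice

random.seed(1)
average_bias = sum(simulate_trial() for _ in range(10_000)) / 10_000
print(average_bias)   # clearly above zero: final estimates drift towards the initial category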

Unravelling the sources of confirmation bias will help us understand, and potentially expand, the bounds of human rationality. We have shown that the brain actively generates confirmation bias through a process similar to the filtering of input by selective attention. Our insights imply an intricate, hitherto unknown link between decision-making and selective attention, and they set the stage for pinpointing the mechanism at the level of the underlying brain circuits.

Original Article:
B. C. Talluri, A. E. Urai, K. Tsetsos, M. Usher, T. H. Donner, Confirmation Bias through Selective Overweighting of Choice-Consistent Evidence. Curr Biol 28, 3128-3135 e3128 (2018)
