Forming beliefs in a world of filter bubbles
Why do so many Republicans still believe that the recent US presidential election was fraudulent? Is it possible to reach coronavirus deniers with factual arguments? A study provides insights into what stops people from changing their minds. The findings have been published in the journal Proceedings of the Royal Society of London B.
By talking to other people and observing their behavior, we can learn new things, acquire new skills, and adapt to changing conditions. But what if the information provided by the social environment is inconsistent or contradictory?
The researchers investigated how people deal with information from diverse social sources, and how they use that information to form beliefs.
What the researchers say: “The internet, in particular, has dramatically changed the structure and dynamics of social interactions. The availability of social sources is to some extent controlled by algorithms—what we see is biased in favor of our own preferences. At the same time, the internet gives us access to potentially conflicting views,” said the lead author.
The researchers first conducted an experimental study with 95 participants from the United States. Participants completed an adapted version of the Berlin Estimate Adjustment Task (BEAST), which reliably measures individuals’ use of social information. They were shown images of groups of animals and asked to estimate the number of animals. They were then shown the estimates of three other participants and asked to make a second estimate. The more participants adjusted their estimates towards those of their peers, the more weight they had given to social information.
Across 30 rounds of the task, the researchers varied the conditions of the study, presenting participants with estimates that deviated to a greater or lesser extent from their own estimate, and that were more or less extreme.
The results showed that whether participants integrated information from the social environment in their second estimate depended on whether and how strongly their peers’ estimates deviated from each other and from their own estimate. Participants were most likely to adjust their estimates when their peers were in close agreement with each other and their estimates were not too different from the participant’s own. Higher variation in peers’ estimates reduced their impact on the participant’s own judgment.
In general, participants gave more weight to their own initial estimate than to their peers’ estimates. Overall, three adjustment strategies were identified: (1) sticking to one’s original estimate, (2) adopting the estimate of one of the three peers, or (3) compromising between one’s original estimate and the peer estimates.
The relative frequency of these strategies differed significantly between study conditions. When participants observed a single peer who closely agreed with them, they were more likely to stick to their original estimate or to adopt the estimate of the near peer. When none of the peers were in close agreement with them, participants were more likely to compromise by adjusting their estimate towards, but rarely beyond, that of the nearest peer.
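The three strategies above can be illustrated with a small sketch. The function below is our own toy rendering, not the study's model: the strategy names and the halfway-compromise fraction are assumptions made for illustration.

```python
def second_estimate(own, peers, strategy):
    """Toy illustration of the three adjustment strategies:
    stay with one's own estimate, copy the nearest peer, or
    compromise towards (but not beyond) the nearest peer."""
    if strategy == "stay":
        # (1) stick to one's original estimate
        return own
    if strategy == "copy":
        # (2) adopt the estimate of the peer closest to one's own
        return min(peers, key=lambda p: abs(p - own))
    if strategy == "compromise":
        # (3) move partway towards the nearest peer;
        # the 0.5 fraction is an illustrative assumption
        nearest = min(peers, key=lambda p: abs(p - own))
        return own + 0.5 * (nearest - own)
    raise ValueError(f"unknown strategy: {strategy}")

# A participant who guessed 100 sees peers estimating 120, 125, 130:
print(second_estimate(100, [120, 125, 130], "compromise"))  # 110.0
```

Note that "compromise" moves towards the nearest peer rather than the peer average, matching the article's observation that participants rarely adjusted beyond the nearest peer.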
“Our experiment quantifies how people weigh their own prior beliefs and the beliefs of others. In our context, there is actually no reason to assume that one’s own estimate is better than anyone else’s. But what we see here is an effect known in psychology as ‘egocentric discounting’, namely that people put more weight on their own beliefs than on those of others,” explains the study co-author. “What’s more, our study reveals that this weighting is strongly impacted by the consistency of others’ beliefs with one’s own: people are more likely to heed information that confirms their own beliefs.”
Building on these findings, the researchers developed a model that integrates the observed adjustment strategies and captures that people pay particular attention to social information that confirms their personal judgements.
Using simulations, they then investigated how people would behave in real-life situations. For example, they simulated a typical filter bubble, where social information tends to come from like-minded people. They also simulated typical attempts to change people’s minds by confronting them with information inconsistent with their own beliefs. Finally, they investigated how people react to being simultaneously exposed to different groups with extreme beliefs. Their simulations suggest that confirmation effects can lead to divergent social information being ignored, filter bubble effects being exacerbated, and people becoming more extreme in their attitudes.
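The qualitative dynamic the simulations describe can be sketched in a few lines. The sketch below is our own minimal caricature, not the researchers' fitted model: the tolerance and adjustment-weight parameters are illustrative assumptions. An agent heeds only sources close to its current belief, so two agents exposed to the same mixed information drift towards opposite camps.

```python
def update(belief, sources, tol=10.0, weight=0.3):
    """One round of confirmation-weighted updating: only sources
    within `tol` of the current belief are heeded; the belief then
    moves a fraction `weight` towards their mean. Parameters are
    illustrative, not the study's values."""
    confirming = [s for s in sources if abs(s - belief) <= tol]
    if not confirming:
        return belief  # divergent information is simply ignored
    target = sum(confirming) / len(confirming)
    return belief + weight * (target - belief)

# The same environment contains two opposing camps of sources:
sources = [0, 5, 10, 90, 95, 100]
a, b = 20.0, 80.0  # two agents with moderately different starting beliefs
for _ in range(20):
    a = update(a, sources)
    b = update(b, sources)
print(round(a), round(b))  # a ends near the low camp, b near the high camp
```

Because each agent filters out the distant camp entirely, moderate starting beliefs are pulled to the extremes, mirroring the article's point that confirmation effects can exacerbate filter bubbles and polarization.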
“Although our study was experimental in design, our model helps explain many contemporary phenomena. It shows how the way people process social information can exacerbate filter bubbles on the internet, and why public debates often become polarized as people quickly become impervious to opposing arguments. As interactions increasingly often take place online, people can often find information that confirms their existing beliefs, making them less willing to listen to alternatives,” said the researchers.
So, what? This study confirms what has been known from past research—namely that people are not persuaded by facts but rather by relationships and emotion. The strong emotions here are fear of damage to our sense of self—which is why we look only for information that confirms our beliefs—and fear of exclusion and the need to belong.
This need to be part of a supportive group is probably the strongest human need and almost every decision we make is in some way affected by it. We alter our opinions and our assumptions in order to belong. This is the core dynamic behind a cult.
Join our tribe
Subscribe to Dr. Bob Murray’s Today’s Research, a free weekly roundup of the latest research in a wide range of scientific disciplines. Explore leadership, strategy, culture, business and social trends, and executive health.