Computer model explains the spread of misinformation

January 16, 2022

It starts with a superspreader, and winds its way through a network of interactions, eventually leaving no one untouched. Those who have been exposed previously may only experience mild effects.

No, it’s not COVID-19. It’s the contagious spread of misinformation and disinformation, the latter being misinformation deliberately intended to deceive.

Now researchers have come up with a computer model that remarkably mirrors the way misinformation spreads in real life. The work might provide insight into how to protect people from the current contagion of misinformation that threatens public health and the health of democracy, the researchers claim.

What the researchers say: “Our society has been grappling with widespread beliefs in conspiracies, increasing political polarization, and distrust in scientific findings,” said the lead author of the study, published in the journal PLOS ONE. “This model could help us get a handle on how misinformation and conspiracy theories are spread, to help come up with strategies to counter them.”

Scientists who study the dissemination of information often take a page from epidemiologists, modeling the spread of false beliefs on the way a disease spreads through a social network. Most of those models, however, treat everyone in the network as equally receptive to any new belief passed on by their contacts.

These researchers instead based their model on the notion that our pre-existing beliefs can strongly influence whether we accept new information. Many people reject factual information supported by evidence if it takes them too far from what they already believe. Health-care workers have commented on the strength of this effect, observing that some patients dying from COVID cling to the belief that COVID does not exist.

To account for this, the researchers assigned a “belief” to each individual in the artificial social network, represented by a number from 0 to 6, with 0 indicating strong disbelief and 6 indicating strong belief. The numbers could represent the spectrum of beliefs on any issue.

For example, 0 might represent strong disbelief that COVID vaccines are safe and effective, while 6 might represent strong belief that they are.
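To make the scale concrete, here is a minimal sketch in Python (our choice of language; the paper's code is not shown in this article) of how such a belief scale might be set up across a population of virtual individuals. The population size and the uniform starting distribution are illustrative assumptions, not details from the study.

```python
import random

# Belief scale from the article: 0 = strong disbelief, 6 = strong belief.
BELIEF_MIN, BELIEF_MAX = 0, 6
NUM_AGENTS = 1000  # arbitrary illustrative population size

# Assign each virtual individual a starting belief. Drawing uniformly
# from 0-6 is our assumption, not a detail taken from the paper.
beliefs = {agent: random.randint(BELIEF_MIN, BELIEF_MAX)
           for agent in range(NUM_AGENTS)}
```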

The model then creates an extensive network of virtual individuals, as well as virtual institutional sources that originate much of the information that cascades through the network. In real life those could be news media, churches, governments, and social media influencers—basically the super-spreaders of information.

The model starts with an institutional source injecting the information into the network. If an individual receives information that is close to their beliefs (for example, a 5 compared to their current 6), they have a higher probability of updating that belief to a 5. If the incoming information differs greatly from their current beliefs (say, a 2 compared to a 6), they will likely reject it completely and hold on to their level-6 belief. Psychologists call this “confirmation bias.”
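As an illustration of that acceptance rule, here is a hedged sketch of a bounded-confidence style update. The threshold of 2 and the acceptance probabilities are our own assumptions; the article only says that nearby messages are more likely to be accepted and distant ones rejected outright.

```python
import random

def maybe_update(current: int, incoming: int, threshold: int = 2) -> int:
    """Return the agent's belief after hearing `incoming` information.

    Sketch of the rule described in the article: information close to
    the current belief is probably accepted; information far from it is
    rejected. Threshold and probabilities are illustrative assumptions.
    """
    distance = abs(incoming - current)
    if distance == 0 or distance > threshold:
        return current  # nothing new, or too far away: keep the old belief
    # Closer messages are more likely to be accepted.
    accept_prob = 1.0 - distance / (threshold + 1)
    if random.random() < accept_prob:
        # Beliefs shift incrementally: move one step toward the message.
        return current + (1 if incoming > current else -1)
    return current

# The article's example: a 5 arriving at a 6 is often accepted,
# while a 2 arriving at a 6 is always rejected.
print(maybe_update(6, 5))  # usually 5
print(maybe_update(6, 2))  # always 6
```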

Other factors, such as the proportion of their contacts that send them the information (basically, peer pressure) or the level of trust in the source, can influence how individuals update their beliefs. A population-wide network model of these interactions then provides an active view of the propagation and staying power of misinformation.
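Putting those pieces together, a single propagation round might look like the following sketch. The graph library (networkx), the trust and peer-pressure weighting formula, and all parameter values are our illustrative assumptions rather than the paper's actual implementation.

```python
import random
import networkx as nx  # assumed dependency for the social graph

def propagation_round(graph, beliefs, message, source_trust=0.8, threshold=2):
    """One round of an institutional source's message moving through the
    network. Peer pressure and source trust follow the article's
    description; the exact formula is our own illustrative choice."""
    new_beliefs = dict(beliefs)
    for agent in graph.nodes:
        neighbours = list(graph.neighbors(agent))
        if not neighbours:
            continue
        # Peer pressure: fraction of contacts already holding the message.
        peer_pressure = sum(beliefs[n] == message for n in neighbours) / len(neighbours)
        distance = abs(message - beliefs[agent])
        if distance == 0 or distance > threshold:
            continue  # already convinced, or rejected as too extreme
        # Acceptance rises with trust and peer pressure, falls with distance.
        # The 0.1 floor lets the source's direct broadcast land even before
        # any neighbour holds the message.
        accept_prob = source_trust * max(peer_pressure, 0.1) * (1 - distance / (threshold + 1))
        if random.random() < accept_prob:
            step = 1 if message > beliefs[agent] else -1
            new_beliefs[agent] = beliefs[agent] + step  # incremental shift
    return new_beliefs

# Usage: a small random social network with beliefs drawn uniformly from 0-6.
g = nx.erdos_renyi_graph(200, 0.05, seed=1)
beliefs = {a: random.randint(0, 6) for a in g.nodes}
for _ in range(50):  # let the cascade run for 50 rounds
    beliefs = propagation_round(g, beliefs, message=6)
```

Note that in this sketch the message only ever moves individuals who are already within the acceptance threshold, which is exactly the confirmation-bias behavior the article describes: distant audiences simply never budge.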

While the current model suggests that beliefs can change only incrementally, other scenarios could be modeled that cause a larger shift in beliefs—for example, a jump from 3 to 6 that could occur when a dramatic event happens to an influencer and they plead with their followers to change their minds.
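That scenario could be layered on top of the incremental rule as a one-off event handler, as in this sketch (the function name and the persuasion probability are hypothetical, not part of the published model):

```python
import random

def dramatic_event(beliefs, followers, new_value=6, persuasion=0.7):
    """An influencer's plea lets each follower jump several steps at
    once (say from 3 straight to 6), bypassing the usual one-step rule.
    Purely our illustration of the scenario the article mentions."""
    for agent in followers:
        if random.random() < persuasion:
            beliefs[agent] = new_value
    return beliefs
```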

“It’s becoming all too clear that simply broadcasting factual information may not be enough to make an impact on public mindset, particularly among those who are locked into a belief system that is not fact-based,” the researchers concluded. “Our initial effort to incorporate that insight into our models of the mechanics of misinformation spread in society may teach us how to bring the public conversation back to facts and evidence.”

So, what? This is a great study and an interesting attempt to map out the spread of dangerous stuff. It also reinforces what many, many studies have shown—that humans do not change their minds on the basis of facts or reason. We change beliefs primarily in order to gain or increase relational support or to belong.

This stands to reason, since we are primarily social animals whose only defense in our natural environment was the support of the people around us. We take on the beliefs of a cult, a sect, or a political party in order to gain the sense of belonging and the support (which may well be illusory) that membership brings. We will try to convince people outside the cult of the rightness of the cult's beliefs, even if we don't think they're true.

For example, take the 2020 election. Most Republicans will say that it was stolen, and that DT won. This is part of the creed that allows them to be accepted by other Republicans and thus to belong to the cult. Contrary “facts” will be dismissed as lies because they threaten the existence of the cult to which they belong and where they find support. The “6” that they claim to believe about the stolen election is literally felt, unconsciously, to be a matter of survival. The election misinformation is espoused and spread to increase the strength of the cult and thus the sense of safety and belonging of its members.

One of the points I make in the book on influencing and persuasion that I am writing is this: to persuade someone of something, first get them committed to the relationship with you. We are all about relationships; facts matter little to us as a species (bonobos and chimpanzees are much more fact- and reason-orientated, as a number of studies have shown).

This is why we are so easy to manipulate.

Dr Bob Murray

Bob Murray, MBA, PhD (Clinical Psychology), is an internationally recognised expert in strategy, leadership, influencing, human motivation and behavioural change.
