
A computer model seeks to explain the spread of misinformation and suggest countermeasures

[Illustration: nodes connected through a network by lines and dots of different sizes and colors]

It starts with a superspreader and winds through a web of interactions, ultimately leaving no one unscathed. Those previously exposed may experience little effect when exposed to a different variant.

No, it’s not a virus. It is the contagious spread of misinformation and disinformation, the latter meaning falsehoods intended expressly to deceive.

Now, researchers at Tufts University have developed a computer model that remarkably mirrors how misinformation spreads in real life. The work could provide insight into how to protect people from the current contagion of misinformation that threatens public health and the health of democracy, the researchers say.

“Our society is grappling with widespread beliefs in conspiracies, growing political polarization and distrust of scientific discovery,” said Nicholas Rabb, a Ph.D. student in computer science at the Tufts School of Engineering and lead author of the study, which appeared January 7 in the journal PLOS ONE. “This model could help us understand how misinformation and conspiracy theories spread, to help find strategies to counter them.”

Scientists studying the spread of information often draw inspiration from epidemiologists, modeling the spread of false beliefs much as they would model a disease spreading through a social network. Most of these models, however, treat network members as if they are all equally likely to accept any new belief passed on to them by their contacts.

Instead, the Tufts researchers based their model on the notion that our pre-existing beliefs can strongly influence whether we accept new information. Many people dismiss factual information backed by evidence if it strays too far from what they already believe. Healthcare workers have commented on the strength of this effect, observing that some patients dying from COVID cling to the belief that COVID does not exist.

To account for this in their model, the researchers assigned a “belief” to each individual in the artificial social network, represented as a number from 0 to 6, with 0 representing strong disbelief and 6 representing strong belief. The numbers could represent the spectrum of beliefs on any issue.

For example, 0 might represent strong disbelief that COVID vaccines are safe and effective, while 6 might represent strong belief that they are.
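To make that encoding concrete, here is a minimal sketch in Python. The 0-to-6 scale comes from the article, but the class name, fields, and example agents are illustrative assumptions, not the researchers’ actual code.

```python
# Minimal sketch of the 0-6 belief scale described above.
# The endpoints are from the article; the Agent structure and labels
# are illustrative assumptions.
from dataclasses import dataclass

BELIEF_MIN, BELIEF_MAX = 0, 6  # 0 = strong disbelief, 6 = strong belief

@dataclass
class Agent:
    belief: int  # current position on the 0-6 spectrum for one issue

# Two example agents on the vaccine-safety question:
skeptic = Agent(belief=0)   # strong disbelief that the vaccines are safe
believer = Agent(belief=6)  # strong belief that they are safe and effective
```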

The model then creates a vast network of virtual individuals, as well as virtual institutional sources that originate much of the information flowing through the network. In real life, these might be news media, churches, governments, and social media influencers; essentially, the super-spreaders of news.
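A toy version of such a network could be wired up as follows. This sketch uses the networkx library, and the topology, population size, and the way institutional sources broadcast to followers are all assumptions, since the article does not specify them.

```python
# Toy network: ordinary individuals plus a few "institutional" source nodes.
# Topology, sizes, and attachment rules are illustrative assumptions.
import random
import networkx as nx

random.seed(0)

N_PEOPLE = 100
graph = nx.watts_strogatz_graph(N_PEOPLE, k=6, p=0.1)  # small-world social ties

# Give every individual an initial belief on the 0-6 scale.
for node in graph.nodes:
    graph.nodes[node]["belief"] = random.randint(0, 6)

# Add institutional sources (media outlets, influencers, etc.) that broadcast
# a fixed message value to a subset of individuals.
for i, message in enumerate([6, 0]):  # one pro-vaccine source, one anti
    source = f"institution_{i}"
    graph.add_node(source, message=message, institutional=True)
    for follower in random.sample(range(N_PEOPLE), 20):
        graph.add_edge(source, follower)
```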

The model starts with an institutional source injecting information into the network. If an individual receives information close to their beliefs, for example a 5 compared to their current 6, they have a higher probability of updating that belief to a 5. If the incoming information differs significantly from their current beliefs, say a 2 versus a 6, they will probably reject it completely and hold on to their level-6 belief.
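That distance-dependent update could be expressed roughly as below. The article only states that acceptance becomes less likely as the gap between the message and the current belief grows, so the particular probability function and one-step update are assumptions made for illustration.

```python
# Illustrative update rule: acceptance probability falls off with the
# distance between the incoming message and the current belief.
import random

def acceptance_probability(current_belief: int, message: int) -> float:
    """Chance of moving toward the message (assumed linear decay, not the paper's)."""
    distance = abs(message - current_belief)
    return max(0.0, 1.0 - distance / 3.0)  # a 5 sent to a 6 is likely accepted; a 2 is not

def maybe_update(current_belief: int, message: int) -> int:
    """Move one step toward the message if it is accepted, else keep the belief."""
    if random.random() < acceptance_probability(current_belief, message):
        if message > current_belief:
            return current_belief + 1
        if message < current_belief:
            return current_belief - 1
    return current_belief
```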

Other factors, such as the proportion of their contacts who send them the information (essentially, peer pressure) or the level of trust in the source, can influence how individuals update their beliefs. A population-scale network model of these interactions then provides a dynamic view of how misinformation spreads and persists.
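Those additional factors could be folded into the same kind of update rule, again as a rough sketch; the specific weighting of peer pressure and source trust below is an assumption for illustration, not the model’s published formula.

```python
# Illustrative extension: acceptance still falls off with belief distance, but is
# boosted when more contacts repeat the message and when the source is trusted.
def acceptance_probability_social(current_belief: int,
                                  message: int,
                                  peer_fraction: float,
                                  trust: float) -> float:
    """peer_fraction and trust are both in [0, 1]; the weights are assumptions."""
    distance = abs(message - current_belief)
    base = max(0.0, 1.0 - distance / 3.0)
    social_boost = 0.5 * peer_fraction + 0.5 * trust
    return min(1.0, base * (0.5 + social_boost))
```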

Future refinements to the model will take into account new insights from both network science and psychology, as well as comparisons of the model’s results with real-world opinion surveys and network structures over time.

While the current model suggests that beliefs can only change gradually, other scenarios could be modeled that cause a larger shift in beliefs, for example, a jump from 3 to 6 that could occur when a dramatic event happens to an influencer who then pleads with their followers to change their minds.

Over time, the computer model can be made more complex to accurately reflect what is happening on the ground, say the researchers, who in addition to Rabb include his faculty adviser Lenore Cowen, a computer science professor; computer scientist Matthias Scheutz; and J.P. de Ruiter, professor of psychology and computer science.

“It is becoming all too clear that the mere dissemination of factual information may not be enough to impact the mindset of the public, especially among those who are locked into a belief system that is not grounded in facts,” said Cowen. “Our initial effort to incorporate this idea into our models of the mechanisms for spreading disinformation in society can teach us how to bring public conversation back to facts and evidence.”