Bad habits and societal failures – not bias – drive misinformation

One of the greatest challenges to modern democracies is the spread of misinformation. Yet the conventional explanations of how and why it spreads may not be right.

In a new paper – Sharing of misinformation is habitual, not just lazy or biased (PNAS, 17 January 2023) – Gizem Ceylan, Ian Anderson and Wendy Wood argue that while misinformation is a worldwide concern carrying socioeconomic and political consequences, we are not so sure about what drives its spread.

They say: “The answer lies in the reward structure on social media that encourages users to form habits of sharing news that engages others and attracts social recognition. Once users form these sharing habits, they respond automatically to recurring cues within the site and are relatively insensitive to the informational consequences of the news shares, whether the news is false or conflicts with their own political beliefs.”

The research draws on a sample of 2,476 participants, which is substantial for social science research but not that huge given the scale of online activity.

Once these reward structures form habits, “information sharing is automatically activated by cues on the platform without users considering response outcomes such as spreading misinformation.”

The authors find that, as a result of user habits, 30 to 40% of the false news spread was due to the 15% most habitual news sharers. Habitual users also shared information that challenged their own political beliefs, suggesting that the echo chamber phenomenon is not a primary driver of misinformation.

What the authors overwhelmingly conclude is that “sharing of false news is part of a broader response pattern established by political beliefs. Finally, we show that sharing of false news is not an inevitable consequence of user habits: Social media sites could be restructured to build habits to share accurate information.”

Easier said than done when, whatever the protestations of social media platform owners, they profit from the very functions which make their platforms pernicious. Clearly Elon Musk recognises this, which is why he is walking back changes the previous regime instituted – despite sort of saying he isn’t.

A few of his own recent Tweets indicate his thinking. For instance: “Parents don’t realise the Soviet level of indoctrination that their children are receiving in elite high schools and colleges.” (Really?)

“The intent is for this site to be fair & impartial, favoring no party, seeking only the least wrong truth.” (What he means by the least wrong truth is a philosophical mystery.)

“Sorry for turning Twitter from nurturing paradise into place that has … trolls” (Does he own the joint or not?)

But how on earth he finds time to run Twitter and Tesla while Tweeting so often is a mystery until you think about how much money he is losing at present.

Another recent PNAS paper – Belief traps: Tackling the inertia of harmful beliefs – by Marten Scheffer, Denny Borsboom, Sander Nieuwenhuis and Frances Westley puts the socio-economic factors in misinformation back into play.

They argue that “pathological beliefs can sustain psychiatric disorders; the belief that rhinoceros horn is an aphrodisiac may drive a species extinct; beliefs about gender or race may fuel discrimination; and belief in conspiracy theories can undermine democracy.”

The casual observer of much Western politics might prefer more definitive words than ‘may’.

Nevertheless, it is clearly no accident that in the US those most likely to believe conspiracy theories are among society’s lower socio-economic groups. They face the greatest pressures and the greatest feelings of insecurity, and are most susceptible to organised campaigns to blame someone else for their troubles – that someone being the ‘elite liberals’ who are allegedly responsible for their situation.

The reality is that their social and economic problems stem from the policies espoused by the rich and powerful who simultaneously fund the campaigns designed to entrench the status quo.

The authors of the belief traps paper say: “beliefs are a key element of healthy cognition. Yet overly rigid beliefs are the basis of societal problems including prejudice, psychiatric disorders, and conspiracy theories.”

“Recent findings from neurobiology, psychiatry and social sciences show resilience of beliefs is boosted by stressful conditions.

“This implies the possibility of self-propelled societal deterioration where rigid beliefs harm the quality of personal and political decisions, evoking more stressful conditions that further rigidify beliefs. Measures reducing social stress, including economic policies such as universal basic income, may be the most effective way to counteract this vicious cycle.”

The authors conclude that a corollary of their research “is that addressing social factors such as poverty, social cleavage and lack of education may be the most effective way to prevent the emergence of rigid beliefs, and thus of problems ranging from psychiatric disorders to prejudices, conspiracy theories, and post-truth policies.”

Who knew?

The blog’s friend John Spitzer drew its attention to the PNAS papers.