There are two very encouraging developments on the fake news front. The first is the emergence of a fail-safe system for evaluating the truth of a claim. It works very simply – if a politician or anyone else immediately brands any claim, accusation or information ‘fake news’, one can automatically conclude that the claim, accusation or information is true and correct.
The second is rather more subtle but possibly more important. A recent paper by Stephan Lewandowsky and Ulrich Ecker (University of WA) with John Cook of George Mason University – Beyond Misinformation: Understanding and coping with the post-truth era – provides a very useful summary of the current ‘post-truth’ situation; why stating the facts is not enough to combat misinformation; and some suggestions both for immediate action and for further research into what actually works in countering misinformation.
The authors make two very powerful points. First, misinformation is often the product of well-financed, decades-long campaigns such as the climate denial campaigns run around the world by big companies, think tanks and lobby groups. Second, “a notable segment of the American public now subscribes to a non-standard epistemology that does not meet conventional criteria of evidentiary support.” In other words, post-truth nonsense, frequently reinforced by politicians and the media, becomes self-perpetuating.
The authors say their paper: “explores the growing abundance of misinformation, how it influences people and how to counter it. We examine the ways in which misinformation can have an adverse impact on society. We summarize how people respond to corrections of misinformation, and what kinds of corrections are most effective.
“We argue that to be effective, scientific research into misinformation must be considered within a larger political, technological, and societal context. The post-truth world emerged as a result of societal mega-trends such as a decline in social capital, growing economic inequality, increased polarization, declining trust in science, and an increasingly fractionated media landscape. We suggest that responses to this malaise must involve technological solutions incorporating psychological principles, an interdisciplinary approach that we describe as ‘technocognition’.”
Perhaps one of the most disturbing findings canvassed in the paper is that there is “evidence that the presence of misinformation causes people to stop believing in facts altogether.”
The authors provide an excellent summary of the growing research into how people respond to corrections of misinformation – in a word, irrationally – and why simple corrections are rarely fully effective. And if you need any further evidence – beyond the election of Trump – that a significant number of Americans are captives of irrational and ignorant attitudes, the authors trace some of the persistent memes such as NASA’s child slavery colony on Mars; the Democrats’ child sex trafficking from a pizzeria basement in Washington; ‘birther’ beliefs about Obama; and the UN plot to install a world government. The last is, of course, a particularly long-standing US belief. The extent of the problem is demonstrated by some 2016 research by the aptly named Kafka (but with given name initial P) indicating that the child sex ring meme was believed, or accepted as possibly true, by a third of Americans and nearly half of Trump voters.
As for antidotes to the problem, the authors discuss quite a few, mainly because there is obviously no one solution. One important recommendation – which has been discussed independently by the ABC’s Media Watch – is the need for the media to be much more rigorous about getting think tanks, corporate front groups and others to disclose affiliations, interests and funding.
Longer-term suggestions centre on community engagement and making voices heard, particularly among scientists; algorithmic fact checkers, such as Google’s proposed web-based app that combines machine-learning and artificial-intelligence technologies to help journalists fact-check; cognitive research into countering post-truth through information ‘inoculation’ strategies; information discernment education programs in schools (in other words, old-fashioned critical thinking skills); the use of ‘nudge’ theories to nudge people towards the truth; and the recognition that improved communication, or more and better information, will not be enough to solve the problem.
In their conclusion the authors talk about how the hope that the Internet would usher in “a new era of cultural and political democracy” has been “as false and as amusing (or tragic) as the predictions from the 1950s that nuclear power would make electricity so plentiful that it would ultimately be given away for free.” Most importantly they stress that post-truth misinformation strategies – such as climate denialism – have “arguably been designed and used as a smokescreen to divert attention from strategic political actions or challenges … a rational strategy that is deployed in pursuit of political objectives.”
On a positive note they suggest “post-truth politics may cease when its effectiveness is curtailed.” For those who think this is hopelessly optimistic it is worth remembering that it can happen even if it sometimes takes a long time and happens in fits and starts – for instance the 500 years since the universal power of the Catholic Church in Europe was broken and the centuries since the Scientific Revolution. In our time the realities of climate change will have a similar effect.
Incidentally, two of the authors have written a handy little Debunking Handbook, which is available as a free download in 10 languages as well as in a PowerPoint presentation format. It is a useful guide to dealing with misinformation. The blog feels a bit reluctant to be seen recommending a PowerPoint version, given its attitude to the distortions PowerPoint can create (see frequent blog mentions of Edward R. Tufte), but this one is different.