Prebunk misinformation
Is it possible to prebunk misinformation?
Is there a vaccine for fake news?
A recent story on 60 Minutes caught my eye and steered me toward the Social Decision-Making Lab at Cambridge University. The director of the lab, Sander van der Linden, told 60 Minutes that misinformation – that which is outright false or incorrect – represents just a small part of people’s overall media diet. “The much bigger part is what we would refer to as misleading information, half-truths, biased narratives, information that is presented out of context.”
In collaboration with partners at Yale and George Mason University, the Cambridge lab recently published “Inoculating the Public against Misinformation about Climate Change”. The approach described in the study has been likened to a “psychological vaccine” against misinformation.
A growing body of research suggests that one promising way to counteract the politicization of science is to convey the high level of normative agreement (“consensus”) among experts about the reality of human-caused climate change. … evidence is provided that it is possible to pre-emptively protect (“inoculate”) public attitudes about climate change against real-world misinformation.
There is a great deal of bad information circulating on the internet, much of it placed as commercially driven click-bait. Other sources include politically motivated state actors seeking to disrupt social cohesion. Misinformation and disinformation are motivating many of the most contentious sections of Canada’s Online Harms Act, Bill C-63. The psychological research being led by Cambridge suggests countering bad information with good information. The researchers found that the way we perceive what other groups believe serves as a cue for our overall informational judgment. So, conveying the fact that scientists and experts agree on an issue can increase perceived consensus and acceptance across the ideological spectrum, either directly or indirectly.
The research suggests that communicating a scientific consensus on such issues as vaccines or human-caused climate change should be accompanied by a warning that politically or economically motivated actors may seek to undermine the findings. In effect, audiences should be provided with what the researchers call a “cognitive repertoire” – a basic explanation of how such disinformation campaigns work – so they can pre-emptively refute those attempts. The research suggests that communicating a social fact, such as a high level of agreement among experts, can be “an effective and depolarizing public engagement strategy.”
According to van der Linden, “everyone is obsessed with influencing each other, but in fact there’s almost no program of research that looks at helping people resist unwanted attempts to persuade them. So that’s where my interest is: helping people resist persuasion when they don’t want it.”
Last fall, I wrote about the cost of misinformation. What if we found ways to prebunk misinformation, inoculating people so they can detect half-truths and lies online?
It seems unlikely that we will ever be able to block the flow of bad information. Does inoculation represent a better approach, enabling Canada to counter misinformation by censuring, not censoring?