I am the family cynic, a title I wear as a badge of honour.
As I have written before, I am often asked to check out a story being shared on Facebook, an email being circulated, a link from Twitter. Is it true?
I tend to read the news with a critical eye. I subscribe to newspapers with opposing viewpoints and frequently read opinion pieces that make my blood boil. And I tend to follow links and check footnotes to see if references and studies are being accurately portrayed.
In other words, I seek information from outside the echo chamber. Unfortunately, most people don’t. A couple of years ago I wrote about this in “Reading just what we want or what we need?” At the time (a few months before the last US election), I observed: “In the absence of opposing viewpoints, partisanship continues to be amplified. People watch just the programs that serve up the same viewpoints; read articles from sources that reinforce the opinions already held.”
Social media networks have been forensically examining the use of their platforms by foreign actors to determine how political outcomes in elections and referenda may have been influenced, including the last US presidential election and the Brexit vote in the UK.
This is not really a new phenomenon; propaganda has gone digital, and propagandists have learned to craft messages that go viral using the same techniques that earn some videos hundreds of millions of views overnight.
The digital world has lowered the cost and accelerated the spread of propaganda, but have we done enough to teach people how to recognize the tell-tale signs of misinformation?
Wikipedia says “fear, uncertainty and doubt” (FUD, for short) is “a disinformation strategy used in sales, marketing, public relations, politics and propaganda. FUD is generally a strategy to influence perception by disseminating negative and dubious or false information and a manifestation of the appeal to fear.”
Today’s highly partisan social networks are filled with countless examples of FUD and other propaganda techniques, such as those we see deployed by a number of shell organizations masquerading as advocacy groups online.
Media releases and Twitter streams of certain groups often show tell-tale signs of some of these techniques (among others):
- Loaded language: use of words with strong emotional implications to influence the audience
- Managing the news: sticking to a few points and repeating them over and over
- Ad nauseam: tireless repetition of an idea such that it may begin to be taken as the truth
- Name calling / demonization of the enemy: making those who support the opposing viewpoint appear to be subhuman
- Ad hominem attacks: attacking one’s opponent, as opposed to attacking their arguments
While a number of democracies are examining how to control the use of social media platforms in order to limit the influence of foreign states in domestic affairs, we need to keep in mind that propaganda doesn’t have to come from foreign actors to harm domestic democratic institutions. Clicktivism, fueled by calls to action from propaganda practitioners, attempts to influence politicians and generates overwhelming volumes of submissions in regulatory consultations, as though quasi-judicial hearings were meant to be decided by popular vote.
Media awareness training can help the public learn to watch for these (and other) propaganda techniques being practised by those manipulating and mobilizing public opinion on social networks.
Last summer, the Canadian government announced $50M in funding for CanCode, a program to give 500,000 students from kindergarten to grade 12 the opportunity to learn computer coding skills. Yesterday, the government announced almost $30M to fund a Digital Literacy Exchange Program.
The Digital Literacy Exchange Program will invest $29.5 million to support initiatives that teach fundamental digital literacy skills to Canadians who would benefit from participating in the digital economy. The program aims to equip Canadians with the necessary skills to engage with computers, mobile devices and the Internet safely, securely and effectively.
The FAQ page for the Digital Literacy Exchange Program indicates that the target audience is new computer users, or people who are not online today, such as: persons with disabilities, Indigenous people, individuals who have not completed high school, residents of rural and remote areas, language minorities, low-income individuals, seniors, and newcomers to Canada.
Are we doing enough to provide citizens of all ages with digital literacy skills, helping them recognize problematic content on the internet? The vision of Media Smarts is to provide children and youth with “the critical thinking skills to engage with media as active and informed digital citizens.”
We need to ensure that such skills are developed in adults as well. All Canadians need to be able to apply critical thinking skills when engaging with the types of disinformation that have become pervasive on social media platforms.
Coding skills might be useful for some of our kids; basic digital literacy skills are important for the Canadians who are not yet online. But there are other, more fundamental online literacy skills that need to be mastered by everyone in a modern digital democracy. Teaching a little more cynicism; showing people how to read with a more critical eye. Helping people detect when stories simply don’t seem right; helping them develop a more sensitive ‘smell test’.
I have asked before, “How do we encourage reading alternate perspectives, consideration of dissenting viewpoints, and engaging in cooperative dialog?”
In a modern democracy, people of all ages, and from all walks of life, need “the critical thinking skills to engage with media as active and informed digital citizens.”