How can we teach children to spot fake news?
Everyone agrees that we have to teach young people to spot fake news – a need highlighted recently by the deluge of misinformation about the pandemic. But how teachers should go about doing this is hotly debated. Some believe critical-thinking skills are the answer, while others say subject knowledge is the key. Holly Korbey finds that neither approach is likely to do the job on its own, and proposes a third way…
Fake news doesn’t take a break, not even for a global pandemic. Over the past few months, while much of the world has stayed at home, online misinformation has continued to churn, with one of the biggest conspiracy theories concerning billionaire Microsoft co-founder and philanthropist Bill Gates.
Using Gates’ past speeches, op-eds and statements warning of a worldwide pandemic (as well as his promises of funding for potential vaccines) as “evidence”, right-wing media and conspiracy groups such as QAnon have twisted his words and accused him of secretly creating Covid-19 in order to make millions from a vaccine. According to The New York Times, the Gates coronavirus conspiracy theory has quite literally gone viral: it is the largest piece of misinformation currently circulating on the internet, and has been mentioned on social media and television more than 1.2 million times.
“Gatesgate” is just the latest conspiracy theory to race through the internet. Misinformation, outright lies and media distortions have always been with us, but the digital age has given misinformation a lethal boost. Streams of misinformation have made it difficult for adults to know what’s true online, but young people fare even worse. Research shows that a majority of young people – middle-school, high-school and even university students – are ill-equipped to identify what’s true online.
Students are flooded with streams of information through their news apps and social media feeds, yet unable to distinguish what’s true from what’s false. Misinformation experts such as Mike Caulfield, who runs the Digital Polarization Initiative, say that by the time students reach undergraduate level, many have reached the point where they believe nothing is true.
“With information overload, what we see happening is that students have low trust in everything,” Caulfield told me in an interview for my book on the “new” civics, Building Better Citizens. “They don’t know how to sort true from false in an effective and efficient way, so their defensive strategy is to trust nothing… Show them a prompt from a reputable newspaper and something fake, and it’s low trust for the truthfulness of both, across the board.”
Since 2016, “media literacy” programmes have been launched in schools to try to combat the misinformation tsunami, but because the issue is so new, there’s little evidence of what works. Governments in both the US and the UK have recognised the problem and provided supportive legislation for media literacy as part of a larger movement to reinstate civics education – 21st-century style – in schools. And yet practitioners have struggled to define what media literacy is and to employ curricula and teaching methods that are effective.
It does not help that, young though the field may be, two opposing schools of thought have already arisen within media literacy programmes on the best way to help students become news- and media-literate. One relies on teaching independent “critical-thinking” skills that would ostensibly give students the ability to think their way through information: deconstructing media messages and questioning where information originated. The other relies on building “background knowledge” to combat misinformation. In the UK especially, this tension between critical thinking and background knowledge has spurred a debate about the best way to equip students to recognise misinformation.
Taking the Bill Gates conspiracy theory as an example, which set of skills would work best in discovering the truth? The Gates videos being presented as “evidence” of foul play are real. Bad actors used their online research skills to “connect the dots” between available facts in ways that they believed no one had yet noticed (probably accompanied by pleas to “open your eyes” to the “real truth”) and so fabricated an entirely new story. Would critical-thinking skills or background knowledge be the fastest way to the truth?
Or what if the solution was more like “none of the above”?
The more you know
Let’s take knowledge first. After a couple of decades focusing on teaching critical thinking as a standalone skill – part of a suite of “21st-century skills” meant to prepare students for a fluid, tech-based economy – the pendulum has begun to swing back towards teaching students facts. Popular books like Natalie Wexler’s The Knowledge Gap have argued that schools fail students because a lack of basic content knowledge in areas such as history, science and the arts keeps them from developing the exact kind of thinking skills society wants students to have.
And though educating young people to be critical thinkers is a worthy and universal education goal, cognitive scientists say critical thinking can’t be taught as a standalone skill. It requires content knowledge in a subject area to do the kind of reasoning – looking at both sides of an issue and coming to solid conclusions – that is critical to success in life and in a democratic society.
“People who have sought to teach critical thinking have assumed that it is a skill, like riding a bicycle,” wrote cognitive psychologist Dan Willingham for the US teacher union AFT, “and that, like other skills, once you learn it, you can apply it in any situation. Research from cognitive science shows that thinking is not that sort of skill.”
Willingham points out that if you ask a student to think about a topic from “multiple perspectives” but they don’t know anything about that topic, that task will be impossible, because thinking is intertwined with knowledge about whatever you’re thinking about. In order to think deeply, in the way schools want students to be able to do, they will first need some kind of content-related background knowledge about the topic.
Background knowledge is certainly necessary for students’ critical thinking, but when it comes to evaluating information on the internet, having enough might be a challenge. The scope of the information available on the internet is too wide, and individual knowledge is necessarily too narrow, to be able to correctly evaluate all the topics one could encounter.
I’m an education journalist, and my area of expertise is schools and learning. If I encountered an article on teaching children to read, there’s a good chance I would be able to evaluate its accuracy – after all, I’ve written lots of stories about reading, and I read education news every day. But what about an article on the Syrian civil war? Or Bill Gates and the development of a coronavirus vaccine? I would never have enough background knowledge to evaluate those correctly. I would need something more than background information to know whom to trust.
Are generic thinking skills the “more” that we require? Teaching students to think critically about what they are reading online is at the heart of many media literacy programmes, and the premise would seem to make sense. For example, the popular CRAAP test requires students to evaluate a website’s currency, relevance, authority, accuracy and purpose by asking themselves a list of questions. Students are instructed to read the website closely, check whether links function and look at the author’s credentials, among other tasks.
But when Sam Wineburg, head of the Stanford History Education Group, ran an experiment testing whether critical thinking helped professors and Stanford undergraduates – two groups adept at critical thinking – to evaluate a website’s validity in a few minutes, the critical thinkers failed.
The two groups were asked to evaluate quickly whether an article on bullying from the American College of Pediatricians was a reputable one. The website had clean graphics, a logo and updated links, and the article was written by a doctor. Both the undergraduates and professors agreed that the site and the article were credible.
But what the professors and undergraduates didn’t realise in their evaluation of the site’s graphics and the doctor’s degree was that the website represented a splinter group, broken off from the American Academy of Pediatrics, which promoted anti-gay views and had been labelled by the Southern Poverty Law Center as a hate group.
What went wrong? No doubt the students and the professors are both intelligent and adept at critical thinking. The problem, Wineburg said, was with the critical thinking itself. “Their big error,” he wrote, “was applying strategies that work in an analogue world to a world where they don’t.”
So, where does that leave us? Fortunately, when Wineburg and the Stanford team went looking for groups who consistently performed well at evaluating online truthfulness, they found one: journalistic fact checkers – the people who comb articles in newspapers and magazines such as The New Yorker for accuracy – had a track record of sniffing out the truth, and they weren’t using critical thinking.
Instead of staying on the article’s web page, after a quick scan, fact checkers immediately opened a series of tabs to check what else was available on the internet about the topic. Instead of reading deeply, fact checkers read laterally, putting a single article into the context of the “web” of the internet to see what else had been written about the topic and who was behind creating the website.
Fact checkers read less than the professors and undergraduates, but learned more in less time. In the case of the American College of Pediatricians, fact checkers used the footnote references in a Wikipedia entry to find complaints about the group and its anti-gay stance – grounds to judge the article unreliable.
Let’s face facts
Reading laterally isn’t fact checkers’ only method. Caulfield, who wrote the go-to guide for learning the fact checker method, Web Literacy for Student Fact-Checkers, names three other “moves” that fact checkers use regularly: checking for previous work on the topic, to see whether a claim has already been fact checked by another source; heading “upstream” to the original source of information using hyperlinks provided in an article; and, when they find themselves lost down a rabbit hole, coming back to the original article and starting again.
Stanford’s research showed that fact checkers had better success at arriving at the truth than critical thinkers, so the Stanford History Education Group created the Civic Online Reasoning curriculum, which provides free, evidence-based lessons and assessments to teach middle- and high-school students the fact checker’s moves.
Wineburg sees learning this method as a bit like getting a digital driver’s licence: knowing the basic rules of the road doesn’t mean you’ll never make a mistake, but it increases the likelihood of getting to a destination safely.
And never has doing less critical thinking been more important, he says.
“We’re faced every day with social media posts and stories that inflame emotions and extinguish our ability to rationally think,” Wineburg tells me in an interview. “What’s at stake here is liberal democracy.”
Indeed, a few years ago, technology and social media scholar Danah Boyd suggested that teaching critical thinking as a solution to misinformation was doing more damage than good. By spending too much time asking students to question and break down media messages to look for bias or motives, she said, teachers can give students the unintended message that there is only one acceptable version of the truth – an idea that can easily be turned on its head and weaponised by extremists looking to recruit young people wanting to question the status quo.
These ideas can take hold even more easily at a time when trust in media is historically low. Can’t you hear one of Bill Gates’ online accusers asking followers to question what they’re seeing right in front of them – a billionaire claiming that he wants to protect the world from a global pandemic? Could that possibly be the whole story? “Open your eyes to what’s right in front of you, sheeple!”
Or as Amy Davidson Sorkin wrote recently in The New Yorker: “In dysfunctional political cultures, much like the present one, there is a conspiratorialist feedback loop: the less you trust, the more you search for alternative authorities, and the more susceptible you are to untrustworthy figures who maintain their position by attacking what is true.”
So, in addition to teaching fact-checking procedures, organisations such as the News Literacy Project seek to combat online misinformation by helping students understand quality journalism – teaching “news literacy” rather than media literacy. Through online lessons featuring journalists and other content experts, and through teacher professional development, they teach middle- and high-school students how standards-based journalism is produced and how it differs from user-produced content on YouTube or TikTok. Peter Adams, senior vice-president for education at the News Literacy Project, says that part of its mission is reducing the widespread teen cynicism about media that often makes young people vulnerable to believing that all information – including propaganda and conspiracy theories – is equally credible.
“Students are often sharing and believing in things coming from RT [the 24-hour, state-controlled Russian propaganda outlet with the tagline ‘Question more’] because they don’t know the difference between the BBC or NPR [the US non-profit National Public Radio] and RT,” Adams says. “They’re all state funded, but they have very different standards, and very different accountability.”
Other groups hope to reach young people by inviting them to do the work of journalism themselves, from fact gathering to interviewing eyewitnesses and experts – a project-based approach that lets students see how journalism is made by producing stories of their own for a real audience.
The Journalistic Learning Initiative, a collaboration between the University of Oregon’s school of journalism and communication and its college of education, works directly with English and social studies classrooms, helping students choose reporting projects that spark their interest – issues such as racism or animal cruelty – and teaching them how to develop a story around them.
It’s the organisation’s hope that through producing news stories, students will learn to tell the difference between news, opinion and “infotainment”, which are often blurred together online. “That’s part of the problem – we’ve lost this distinction,” says co-founder Ed Madison. “We don’t have a culture that reads newspapers, but if it was on the front page, it was news, and opinion [would be] on the editorial page. There’s a clear delineation that doesn’t exist on the internet and television.”
Clearly, combating misinformation and restoring trust in media to beat the conspiracy machine isn’t a simple process and doesn’t yield to simple solutions. But a functional society requires public trust, and schools must be part of the solution.
And we need to find that solution quickly. While education systems scramble to figure out the best way to help students, the swirl of conspiracy surrounding Bill Gates will soon be replaced by another, perhaps bigger and more threatening conspiracy theory. What the evidence has shown us so far is that we can’t critically think our way out.
Holly Korbey is a journalist and the author of Building Better Citizens. Her work has appeared in The New York Times, The Washington Post and The Atlantic, and she is a regular contributor on education for KQED MindShift and Edutopia. She tweets @hkorbey