Mike Evans, the lead instructor of US Government 1101 at Georgia State University in 2021, had watched his students arrive with fewer facts and more conspiracy theories. Gone were the days when students showed up on campus with only vague memories of high school civics. Now they arrived armed with bold, often misleading beliefs, shaped by the hours they spent every day on TikTok, YouTube and Instagram.
One example of misinformation circulating at the time was an anonymously posted video that, according to more than half of teenagers in a national survey, provided "strong evidence" of voter fraud in the US. The video was actually recorded in Russia – crucial context that could be uncovered by typing a few keywords into a search engine.
Ignoring the problem of online credulity seemed irresponsible, even negligent. How could the course fulfill its goal of helping students become "effective and responsible participants in American democracy" if it disregarded digital misinformation?
At the same time, thoroughly overhauling a course that enrolls more than 4,000 students every year – with 15 instructors teaching 42 sections in face-to-face, online and hybrid formats – would be a logistical nightmare.
That was when Evans, a political scientist, discovered the Civic Online Reasoning program, developed by the research group I directed at Stanford University. The program, available for free, teaches a set of strategies based on how professional fact-checkers evaluate online information.
In the fall of 2021, Evans approached us with a question: could aspects of the program be incorporated into US Government 1101 without completely changing the course? My team and I thought so.
Ensuring an informed citizenry
Evans' challenge was not unique to his campus.
For Generation Z, born between 1997 and 2012, social media – especially YouTube, TikTok, Instagram and Snapchat – has become the main source of information about the world, eclipsing traditional media. In a survey of more than a thousand young people ages 13 to 18, eight out of ten reported encountering conspiracy theories on social media every week; however, only 39% had received any instruction in evaluating the claims they saw there.
We created our Civic Online Reasoning program to address this gap.
When we launched the program in 2018, digital literacy was a broad concept that covered everything from editing and uploading videos to cyberbullying and sexting. "Checking the credibility of sources" was just one criterion among many on a list of desired outcomes.
We focused our program on skills essential to informed citizenship, such as "lateral reading": using the full context of the internet to judge the quality of a claim, identify the people or organizations behind it, and evaluate its credibility.
Instead of focusing only on the message, we teach students to investigate the messenger: What organizations back the claim? Does the source have a conflict of interest? What are its credentials or expertise?
We tested our approach in an experiment in twelfth-grade classrooms in Lincoln Public Schools in Nebraska. In six hours of instruction – two fewer than the average teenager spends online each day – students nearly doubled their ability to find quality information compared with a control group. We figured it would not be a great challenge to extend our approach to college classrooms.
For a version adapted to Evans' course, we designed six short modules that could be used asynchronously, allowing students to complete them at their own pace regardless of the course format. Unlike traditional information literacy lessons, which stand apart from the details of any particular discipline, our modules were closely tied to the content of the course.
In a unit on the executive branch, for example, students analyzed an Instagram video that falsely claimed President Joe Biden wanted Americans to pay more at the gas pump.
In a module on the judicial branch, they watched a TikTok video about Ketanji Brown Jackson's confirmation to the Supreme Court, posted by a left-wing partisan organization. We created videos showing how to deconstruct tactics common in political campaigns: quotes taken out of context, selectively spliced and edited videos, and corporate-funded websites that pass themselves off as grassroots efforts.
We also taught students to fact-check like professionals. The core strategy was lateral reading: searching the internet to see what other, more credible sources say about an organization or influencer. We also questioned common assumptions, such as the idea that Wikipedia is always unreliable.
This is not true, especially for "protected pages," marked with a lock icon at the top of an article, which block edits except by established Wikipedians. Another mistaken belief is that a ".org" website has passed rigorous vetting qualifying it as a public-benefit organization – which was never true: ".org" has always been an "open" domain that anyone can register, no questions asked.
These lessons took up only 150 minutes over the semester, and instructors didn't need to change anything else; they simply added the lessons to the course schedule.
Positive results, modest effort
Did this approach work for Evans and his US Government 1101 students?
Over two semesters of an academic year, 3,488 students took an exam at the beginning and end of the course. It included questions such as evaluating a website that claimed "not to represent any industry or political group" but was actually backed by fossil fuel interests.
In June, Evans, two co-authors and I uploaded a preprint of a journal article, not yet peer-reviewed, documenting the experiment and its results. We found that, from the beginning to the end of the semester, students became much more skilled at identifying suspicious sources and more sure-footed at evaluating where information came from. Student scores improved by 18%. Even better, 80% said they had "learned important things" from the modules.
Not bad for an easy add-on to the course.
These results add to other studies we have conducted – one in a college nutrition class, another in an introduction to rhetoric and writing – that similarly showed how educators can improve students' digital literacy (and their awareness of disinformation) without significantly disrupting the curriculum.
And I think it's necessary. There is a wide gap between the vetted content that appears on students' reading lists and the enormous amount of unregulated, unverified and unreliable information they consume online.
The good news? This intervention could work in any subject where disinformation proliferates: history, nutrition, economics, biology and politics. Similar findings on other college campuses reinforce our confidence in the approach.
These changes don't require waiting for a grand revolution. Small steps can make a difference. And in a world flooded with misinformation, helping students learn to distinguish fact from fiction may be the most civic-minded thing we can do.
*Sam Wineburg is Professor Emeritus of Education at Stanford University.
This article was originally published in The Conversation