Real, Fake, or Deepfake? This Lesson Helps Students Decide
Students examine videos and online information to investigate what is real and what is not in this engaging lesson.
The digital world is filled with misinformation and AI-generated images and videos—called deepfakes—making it increasingly difficult to determine what is real and what is fake. To better set our students up for success in an increasingly complex digital world, we wanted to find a way to help them investigate the information around them.
We created a 60-minute lesson in which middle school students use observation and reasoning to evaluate four short, kid-friendly videos, as well as two websites, to determine which are intentionally misleading. In the process, students learn strategies to identify misinformation and the importance of using evidence to defend their thinking.
Starting the Lesson: The Hook
The lesson begins with a simple question: “Who knows anything about cats?” Hands shoot up, and the class watches a six-second segment of this video of a cat standing at the end of a diving board, springing up, and doing an elaborate, Olympics-worthy dive into a swimming pool. The clip is shown again in slow motion. To start a class discussion, students are asked, “What were your observations, and what were your inferences?” They unanimously conclude that the video must be AI-generated.
When asked what brought them to that conclusion, students share a range of evidence, from their knowledge of cats who hate water to their gut feeling that something seemed off. We discuss how prior knowledge and trusting your intuition can help you detect misinformation, but there are also strategies that provide further evidence.
Diving Deeper: Assessing Information From Varied Sources
Next, as a class we watch four video clips: three are intentionally misleading, some using AI-generated images, and one is factual. The clips range from 13 to 30 seconds; links to all four are included in the Google Doc for students referenced below.
Students then watch the clips again independently with this document shared via Google Classroom. To avoid giving away clues through video descriptions or comments, the videos are shared as screen-recorded clips rather than played directly from their YouTube listings. Students have 10 minutes to decide which video is real and which three have been altered. To guide their thinking, they’re given a sentence stem, “I think video #1 is real or fake because…,” which emphasizes the importance of evidence.
After the timer goes off, students move to one of four corners of the room to indicate which video they believe is real. In every class, no one chose the celebrity pitch, but the other three videos were evenly divided. In their corners, students discuss their reasoning before returning to the whole group.
The truth about each video is then revealed using this teacher’s version, which explains the tricks, including how a deepfake is made, and shares strategies for identifying misinformation, such as reverse image searches.
Putting It Into Practice: Applying the Skills
Students then use their new skills to determine which of these two websites is filled with fake information:
Students are told to be both detectives and lawyers, gathering evidence to defend their opinion. As students dive in, the room is filled with energy and curiosity. Students ask questions like these:
- “Can I do some side research?”
- “Can I try a reverse Google image search?”
- “Can I Google Tasos Kokkinidis and see if he’s real?”
All of these suggestions are encouraged and are great examples of checking a source’s reliability. While students are working independently, they’re also collaborating with nearby classmates. Comments we overheard include:
- “Google translate that; it’s a fake language!”
- “Can an octopus only have seven legs? Wait, don’t they live in the water?”
- “Look up Kelvinic University! I don’t think it’s real.”
Engagement is high—students are both entertained and shocked to see the level of misinformation on the Pacific Northwest tree octopus website. When it’s time to wrap up, students ask for more time before finding out the answer. We tell them we’ll share evidence at the start of class tomorrow, and a third of the students choose to continue to explore the websites at home and gather more evidence.
At the start of the next class, students come in eager to share their evidence that the tree octopus isn’t real. This includes side research on octopuses, proof that the Latin name for the Pacific Northwest tree octopus isn’t real, reverse Google Image searches, and analysis of the tree octopus website’s photos and claims of sightings.
Afterward, we explain that this website was designed to teach about misinformation, and we list all the approaches students used to reach that conclusion. Many students also share that if they hadn’t had prior knowledge about the anaconda from the earlier activity, it would have taken more research to figure out which website was real. This emphasizes the importance of prior knowledge.
Why This Activity Works
The short videos and websites are engaging to students, as is the challenge of deciding what is real and what is not. Students especially enjoy videos about animals, and many middle school students are interested in online “influencers,” including celebrities. The lesson teaches many skills: observation, inference, identifying misinformation, and supporting claims with evidence. It works especially well when each student has a device and when technology like Google Classroom is in place so that resources can be easily shared.
Because students are so engaged, these concepts are more likely to be retained as teachers reference the skills in later classes. Through this lesson, students realize on their own that the more you know, the harder it is to be tricked by misinformation.
