As a learning specialist who, along with supporting the students in my charge, works side by side with colleagues in all content areas, I want to encourage faculty as they consider the impacts of tools such as ChatGPT. For English and social studies teachers, for example, this just might be their “calculator moment”: As math teachers had to adapt their teaching when calculators became ubiquitous, so, too, must teachers of courses that are rich with reading and writing, now that AI tools are at students’ fingertips.
December 2022 saw a spike in the number of articles, blog posts, and think pieces musing about what ChatGPT and the like might—OK, do—mean for education. Many rang alarm bells, and a few pointed out the shortcomings of the tool as currently configured. I spoke to our division’s director of technology just before our winter holiday about what’s being talked about in his professional circles regarding these artificial intelligence tools.
It was a lively conversation with far more questions than answers. I did respond with certainty, though, when he asked me what my philosophy is about ChatGPT and tools like it. “If I could turn back time and make sure we do not invent such things, I would.”
My response gives away the fact that I’m what can only be described as a Luddite. My watch only tells time. I listen to records. Not too long ago, I had a Nokia phone. Still, I’m no ostrich. I continued, “But that horse has already left the barn. ChatGPT is not coming to our classrooms—it’s already here.”
Here are some ideas for incorporating AI tools into our classes while retaining and reinforcing critical thinking skills.
Fact-check the machine
Zeynep Tufekci, a New York Times opinion writer and associate professor at the University of North Carolina, warns that because tools like ChatGPT are trained on everything written on the internet, the text they produce is likely to contain inaccuracies. She describes the risk of being subjected to “a high-quality intellectual snow job.”
There can be an upside, however. In a social studies classroom, students might craft a prompt about a topic they’ve been considering and then examine the machine’s response in forensic detail. This may involve a sentence-by-sentence dissection of what the AI has written. By unearthing possible inconsistencies or straight-up inaccuracies, students reinforce their correct understanding of the topic.
Find the biases
As a variation on the above, students create prompts across a variety of related topics, then evaluate the responses, looking for patterns of inaccuracy. Since ChatGPT and its ilk are trained on what already exists, underrepresented writing and voices will necessarily be marginalized further through the iterative processes inherent in AI tools.
How can we guide our students to not only recognize these blind spots but also incorporate a greater multiplicity of viewpoints in their scholarship?
Spot the computer essay
John Warner, blogger for Inside Higher Ed and author of Why They Can’t Write: Killing the Five-Paragraph Essay and Other Necessities, reminds us that these tools are only rearranging words and not applying feeling, heart, or other emotional components; the results lack energy. In a language arts classroom, we might describe such a piece of writing as having a weak “voice.”
For a playful exercise, share two or three pieces of human writing from the past year or two, slip in an example from ChatGPT, and have students discuss what makes these examples human—or decidedly not. Nuance, passion, and, perhaps, even fallibility will be clues that students can investigate.
Use primary sources and first-person accounts
This and the following ideas are writing assignments that will make the use of AI tools superfluous, if not counterproductive.
My colleague Joy Xu in the English department asks students to interview someone and connect this person’s experience to a text. First, the narrative/expository combination provides more points of entry for student writing.
Second, this synthesis ensures that no machine prompt can supply the language needed to express the student’s new learning.
Compare two readings
Peter Greene, a senior writer for Forbes magazine and a high school English teacher of nearly 40 years, suggests that we require students “to compare and contrast a pair of literary works (provided the comparison is not a time-worn one).”
Students will create unique essays that demonstrate their learning, the connections they’ve made, and the meaning they’ve constructed. Again, because these AI tools are trained on what can be found on the internet, the rarer the comparison, the rarer the computer-generated analysis.
Expect classroom discussion to be used as a resource
This idea, loosely based on another Peter Greene suggestion, considers a key component of what he calls “authentic writing”: class discussions. He recommends that the discussion “[become] one of the texts being considered.”
One likely positive result of this approach: students paying closer attention during classroom conversations. Dare I say they may even take notes?
In The Chronicle of Higher Education, Beth McMurtrie shares that students often submit writing that they consider little more than “an assemblage of words repeated back to the teacher,” while we teachers believe that student writing represents their thinking. We want our students to think and to continue to develop their thinking.
Tufekci reminds us in her opinion piece that essay writing teaches students how to conduct research, judge claims, synthesize knowledge, and express that knowledge persuasively and coherently. She continues, “Those skills will be even more important because of advances in AI.”
Rather than bury our heads in the sand, let’s equip our students for the world of right now and for what it might be going forward.
I’m still going to listen to records, though.