
Grappling With AI Writing Technologies in the Classroom

ChatGPT and other AI tools complicate how teachers think about student writing and how they teach academic integrity.

January 26, 2023

As a child of the 1990s, I recall arguing with teachers about why I should be allowed to use a calculator to do math and why, with the advent of personal computing, having to learn cursive should go the way of the abacus. 

“Get with the times,” I told my teachers, asking why I should have to perform tasks the old-fashioned way when new technology already made life easier and more efficient.

Fast-forward: Karma hit me when my high school history students told me about ChatGPT, a currently free online platform that, true to its advertising, “is trained to follow instructions in a prompt and provide detailed responses.”

No matter what I ask of ChatGPT, it generates well-written responses in mere seconds, with interesting, if not always accurate, content. Upon researching more about these “writing assistants,” I came across Jasper, which, while charging for access, proved to be even more intelligent and intuitive, but also imperfect.

Experimenting with AI Writing Platforms

Using a free five-day trial, I asked Jasper to respond to my students’ upcoming take-home essay assignment about religion in Harriet Jacobs’s autobiography, Incidents in the Life of a Slave Girl. After just a few quick keystrokes and some fine-tuning of my output parameters, Jasper produced a 1,500-word essay, including supposedly legitimate quotations from the narrative, with in-text citations. Keep in mind that with additional time and tinkering, Jasper could have delved even more deeply into any area.

Examine the body paragraph below, which includes a clear topic sentence, accurate analysis, solid diction, and tight sentence structure:  

Furthermore, reliance on Christian principles such as compassion and forgiveness becomes a vehicle by which Jacobs navigates her life as a slave. She expresses this idea when describing her master Dr Flint’s cruelty towards her: “The better feelings which nature had given him were dead.… Here then was my home—my cruel home—made desolate by the demon Slavery! Yet heaven blessed it with one true friend—my good old grandmother! She prayed for me continually… her prayers kept alive my Christian virtues” (Jacobs 30–31). These Christian values—such as understanding even amidst hatred—help elevate Jacobs’s spirit despite her oppressive conditions, allowing her to continue fighting for freedom without becoming consumed by bitterness or rage.

However scholarly and convincing this paragraph may sound, be wary! Trust me: I speak from experience, as a high school history and journalism teacher of 16 years who also writes about media literacy and fake news.

I almost signed off on this story getting posted without verifying Jasper’s use of text evidence, which, as it turns out, does not exist in Incidents in the Life of a Slave Girl. In fact, not only did I come up with bupkis after searching for matching words and phrases in an electronic version of the book, but a Google search also revealed no exact matches. 

I should have known better, but if my ordeal encourages teachers and students not to be fooled by AI and scholarly-sounding prose, I happily fall on my sword.

Jasper did translate portions of my essay into near-flawless Spanish, as two Spanish-teaching colleagues attest. The platform boasts that it can do just as good a job in another 24 languages. Furthermore, Jasper advertises that it “will almost always generate unique content, particularly for longer content.” This raises a whole host of ethical issues and questions for teachers, students, administrators, and everybody else involved in producing and consuming content.

Implications for Educators

Witnessing the power of this technology firsthand, I’m left with many burning questions as an educator, many of which I can scarcely begin to fathom: With AI generating unique content, how can teachers encourage and enforce academic honesty? If it is extremely easy to cut corners, avoid plagiarism-detection software, and still submit quality work, how and to what extent can educators persuade students to put in hard work the old-fashioned way?

Along those lines, I dare to ask how AI will change what educators expect students to know and be able to do. For example, how will AI change how we assess student learning and growth? Will educators give fewer take-home assignments while having students produce more of their output in class, under direct supervision? Will we have to rethink the need to teach penmanship or invest in proctoring software? What does more time devoted to in-class work mean for how much content teachers can cover each year, or for a particular subject?

Should teachers entirely shun AI, dismissing it as an easy and attractive way for students to get out of doing real work, which is needed to really learn and grow, or should teachers acknowledge its existence, experimenting with and researching how it can aid learning and growth? Might it enhance accessibility for students with disabilities? What would a healthy use of this software look like, and who should act as the arbiter? 

I am extremely excited about what AI could mean for modeling effective writing and research. And yet, I, like so many other teachers, am stuck wondering who should ultimately be held responsible for content produced by AI. If AI misquotes, plagiarizes, or does something else nefarious, to what extent is the student user also responsible? Along with the student, how and to what extent should a particular AI platform be held accountable? What factors should administrators consider in trying to get to the bottom of any error or oversight?

To make another 1990s reference, I don’t think that platforms like ChatGPT and Jasper indicate the rise of Skynet, the super-intelligent AI from the Terminator films that gains self-awareness and reduces humanity to a postapocalyptic hellscape. As I have seen in my own classroom, technology can open unexpected opportunities for learning.

Technology as a Facilitator of Learning

I often project responses to a given prompt in the classroom, but AI could take this practice to a whole new level. While I do not want my students to feel tempted to copy and paste a response, there is significant value in asking students to observe and study stellar writing in action. I am already thinking about how AI could save me time to cover more content with students—for example, by generating mentor texts to showcase for whole-group discussion.

In 2023, we still debate the vices and virtues of letting students use calculators in math and science settings, yet that technology has won near-ubiquitous acceptance. The lesson conveys a fundamental truth: No matter our wishes, content-generating AI is here to stay.

As we have always done, teachers must navigate with students how to make effective and ethical use of this new technology. It’s way too early to know how to do this or what this entails, though one thing is for certain—if we do nothing but lament how AI spells doom for education, we lose a sacred chance to help guide students into a brave new world. 
