What Students Lose When They Read AI-Adapted Texts
There is wisdom in human-crafted words, and it is hard-earned
The Important Work is a space for writing instructors at all levels—high school, college, and beyond—to share reflections about teaching writing in the era of generative AI.
This week’s post is by MG Prezioso. It originally appeared in The Boston Globe. MG is a postdoctoral researcher and instructor in the Harvard College Writing Program. You can learn more about MG on her website or on LinkedIn.
If you’re interested in sharing a reflection for The Important Work, you can find all the information here. Your reflection does not have to be about using AI in the classroom—we are interested in any way that you’re thinking about the important work these days. If you’ve redesigned assignments to avoid using AI, if you have strong feelings about when and how AI should or should not be used in the classroom, if you do something that you think works very well without AI, we want to hear about that too. —Jane Rosenzweig
Chances are, if you’ve seen a movie adaptation of a book you’ve read, you’ve left the theater saying “I can’t believe they took out X!” or “I wish they didn’t change Y!” I am still indignant, for example, over the burning of The Burrow in the sixth “Harry Potter” film. (It wasn’t in the book! So unnecessary!) When we’re talking about a book that’s been altered by Hollywood, this outrage makes for excellent dinner conversation. But what if we’re talking about a book that has been modified by AI for classroom use?
Known as AI-adaptive, or AI-adapted, texts, these materials are slowly gaining traction in K-12 education. According to a RAND research study published in February, 17 percent of US teachers in a nationally representative sample reported using AI tools like ChatGPT to generate or adjust instructional materials. Education technology and mobile app companies, too, have developed AI-assisted platforms capable of adapting a range of texts, from news articles and historical speeches to classic novels and plays. So rather than reading “The Gettysburg Address” or “Romeo and Juliet,” students could be reading an AI-adapted version of those works instead.
As an education researcher, I understand the appeal of AI-adapted texts. Classrooms play host to students with a range of language and literacy skills, and AI-adapted materials, which can be translated and tailored to each individual’s reading level, allow students to access the same content — along with supplementary resources, like discussion questions and vocabulary words — at their own pace. This is especially valuable in social studies and the sciences, where information is a prerequisite for conceptual understanding.
However, I also have concerns about AI-adapted texts — concerns I believe are worth considering if we are to leverage these tools responsibly.
First, AI-adapted materials may not be the best way to improve reading outcomes. Students need access to complex texts with varied sentence structures and sophisticated language to support reading comprehension development. AI-generated writing, by contrast, can be syntactically repetitive and stylistically flat. Will AI-adapted materials really boost students’ literacy skills? Why not use authentic texts with additional instructional support, like drawing on background knowledge or helping students break down meaning-filled, “juicy” sentences instead?
There is also the question of artistic integrity. If we are modifying an informational text to be more engaging or to help students acquire content, that’s one thing. But AI adaptations of fiction? Essays and memoirs? Can we really trust AI, with its racial and gender biases, to adapt a novel like “Beloved” — one that embodies not only Toni Morrison’s lyrical, enchanting style but also the complexities of the Black experience?
Most of all, though, I worry about how AI-adapted texts will affect students’ love of reading.
I research children’s “story world absorption”: how young readers become hooked, or immersed, in books. Evoking memories of warm summer days spent traversing magical worlds, absorption is a state of rapture and enjoyment — one that cultivates a lifelong appreciation for reading and enriches our understanding of ourselves and the world.
Absorption depends on many factors, from students’ ages and extracurricular interests to genre and text type. One of absorption’s most important determinants, however, is language — what literary scholar Rita Felski calls the “irresistible combination of word choice and syntax.” From sensory imagery to character dialogue, words are the vehicle for absorption, transporting us to the world of the story and deepening our investment in it.
If we use AI to adapt existing works and the modified versions are not irresistible but repetitive and flat, we can hamper students’ ability to become absorbed in texts. Most children already view reading as a chore. The amount of time children spend reading for fun continues to decline. Some students aren’t even reading whole books in school. Will AI exacerbate this trend?
Teachers could conceivably train AI models to imitate an author’s style, as some novelists have done with mixed success. But training a model to adapt an entire work would be time-consuming. Plus, emulating an author’s authentic style at a specific reading level with limited syntax and vocabulary is an impossible task.
What’s more, divorcing a story from its style is like separating humans from atoms. You can try. But if you were to succeed, you would create something entirely different.
Take, for example, the end of “The Great Gatsby,” a novel commonly read in high school English classrooms:
“Gatsby believed in the green light, the orgastic future that year by year recedes before us. It eluded us then, but that’s no matter — tomorrow we will run faster, stretch out our arms farther…. And one fine morning —
“So we beat on, boats against the current, borne back ceaselessly into the past.”
Now, compare that with an adapted version I generated using ChatGPT:
“Gatsby believed in the green light, a symbol of the bright future he dreamed of. Each year, that future seemed to slip further away. It was out of reach then, but that didn’t stop him — tomorrow he would try harder, stretch further … and maybe one day, he would achieve it. So, we keep moving forward, like boats trying to move against the current, always being pulled back into the past.”
The original is elegant, complex. Its arrangement of words and syntax, from the expanded clauses to the ellipsis and dashes, embodies Gatsby’s yearning for the past, as well as our own faith in, and pursuit of, illusory dreams. The message is tragic, but its tone is hopeful, leaving us to wonder: Are we foolish for beating on, or is the honor in the attempt?
The AI adaptation, by contrast, is rigid, mechanical. Its lack of complexity and flattened tone leave us not wondering but resigned. Plus, the phrase “we keep moving forward” implies progress — advancement. Is that not the opposite of Fitzgerald’s message?
The two versions are distinct. But even if they weren’t — even if the AI’s message were identical to Fitzgerald’s — it wouldn’t matter.
What makes Fitzgerald’s writing — all writing — meaningful is that it is crafted by a person, a breathing, feeling person. Books are, as E.B. White observed, “the Self escaping into the open,” an attempt to articulate the vast mysteries and depth of experience. There is wisdom in human-crafted words, and it is hard-earned.
We mustn’t overlook this wisdom. It’s why we read. It’s what we’ll lose. And it’s something AI will never provide.
Magic School AI includes the first chapter of The Great Gatsby as the exemplar activity within their Text Leveler tool. You can rewrite the text at a first-grade reading level. Many of the high school teachers were shocked that the tech could do this and wanted to know why an edtech company like Magic School would use a text like Gatsby as an example. I think it's clear they don't care what teachers think. Screenshot included in a Google Doc: https://docs.google.com/document/d/19b8_GU9uZOLv6sFNLFSqsAkSReaKb6RbD_v_gUHeSCI/edit?usp=sharing
To be fair, it’s just a new way of producing what has been done for decades. Abridged publications exist for many, many classics. Teachers have spent hours adapting extracts from complex texts with simplified sentence structures and vocabulary so that pupils with learning needs and disabilities can at least begin to access the texts. Side-by-side modern & original Shakespeare is a necessity in the classroom. It’s not one OR the other, it’s always both. We use adapted texts to help pupils into the originals.
It’s like you or me tackling Shakespeare, Middle English, or Anglo-Saxon texts: we do not understand them immediately and need some help to get in. Then we can appreciate the beauty, etc.
If AI is saving hours upon hours of teachers adapting and “translating” complex language for their pupils so that they can then get on to appreciate the original, that’s a good thing, no?
The days of halcyon immersive reading amongst the tulips just don’t exist any more. Humans, not just young people, are increasingly screen-addicted to the endless scroll on demand. We can make reading more attractive by making it accessible, but perhaps more importantly by divorcing it from grades and utilitarian education systems.