AI is creating a divide between teachers and students
When teachers know their students are gaming the system and students know their teachers know, the relationship frays
The Important Work is a space for writing instructors at all levels—high school, college, and beyond—to share reflections about teaching writing in the era of generative AI.
This week’s post is by William Liang and Liz Shulman. William Liang lives in the San Francisco Bay Area. His writing has appeared in The Hill, The Daily Wire, The San Diego Union-Tribune, and The San Francisco Chronicle. He also posts on X.
Liz Shulman teaches English at Evanston Township High School in Evanston, Ill., and is an instructor in the School of Education and Social Policy at Northwestern University. Her writing has appeared in The Wall Street Journal, Slate, HuffPost, and this newspaper. You can visit her website or find her on Facebook and Instagram.
Their piece originally appeared in the Ideas section of the Boston Globe.
If you’re interested in sharing a reflection for The Important Work, you can find all the information here. —Jane Rosenzweig
Among the most important relationships students have is with their teachers. This bond is so crucial it can determine how much a student likes school — and how much they learn. And for teachers, the satisfaction of the work lies in their connection with students. Whether students are engaged or checked out matters hugely to how meaningful—or not—a teacher’s work is.
Unfortunately, the use of artificial intelligence in schools is fracturing the student-teacher relationship, giving students the false impression that they don’t really have to think for themselves or listen to their teachers. It’s also eroding teachers’ trust in their students as a game of spot-the-AI-generated-assignment takes over their day jobs.
One of us is a high school student in San Jose, Calif., and the other is a high school teacher in Chicago. We have written this together to show how Big Tech companies' multibillion-dollar push to get educators and students to use AI in schools is creating a divide between students and their teachers.
These AI products aim to bypass teachers and all but destroy their roles as mentors and classroom coaches. And Big Tech is ignoring what students really need—namely, boundaries, structure, trust, critical thinking, and incentives to take academic risks. How will they learn to innovate — a crucial employment and life skill—if AI programs do their thinking for them? Twice as many students used AI to cheat in 2025 as did in 2023, a finding that was not a surprise to us given what we are seeing from our different vantages in our classrooms.
A HIGH SCHOOL STUDENT’S POINT OF VIEW
For me (William) and my classmates, there's neither moral hand-wringing nor curiosity about AI as a novelty or a learning aid. For us, it's simply a tool that enables us not to have to think for ourselves. We don't care when our teachers tell us to be ethical or mindful with generative AI like ChatGPT. We don't think twice about feeding it entire assignments and plugging its AI slop into AI humanizing tools before checking the outcome with myriad AI detectors to make sure teachers can't catch us. Handwritten assignments? We'll happily transcribe AI output onto physical paper.
Last year, my science teacher did a “responsible AI use” lecture in preparation for a multiweek take-home paper. We were told to “use it as a tool” and “thinking partner.” As I glanced around the classroom, I saw that many students had already generated entire drafts before our teacher had finished going over the rubric.
When teachers know their students are gaming the system and students know their teachers know, the relationship frays. Why bother listening to feedback when we didn’t write the work anyway? Why respect a teacher’s guidance when the online “tutor,” the one that answers instantly, is open in another tab? Why bother learning when schools are encouraging their teachers to deploy AI tools in the classroom and thereby effectively telling us we don’t need to learn?
No one wins in this arrangement. Teachers are trying to uphold standards of academic integrity even as such standards have been rendered unenforceable. Students are expected to act like curious angels with the probity of saints. Meanwhile, all of us are ensnared in the false promise of the so-called AI future of education as marketed by the AI companies.
A HIGH SCHOOL TEACHER’S POINT OF VIEW
This is Liz. Last year, one of my students forgot to delete the instructions he fed into the AI generator he used. “Make it sound like an average ninth-grader” was next to the title of his “Romeo and Juliet” essay. He shrugged when I asked about it.
When I asked another student if she had used AI to write her essay, she denied it. But on her way out of the classroom that day, I overheard her say to a friend, “Of course I used AI. We all do.” She didn’t speak to me again for the rest of the semester. There was nothing I could do to repair the relationship, even though she is the one who lied to me.
But here’s the paradox: Educators like me are under enormous pressure to use AI in our classrooms. Apps like Class Companion, Brisk Teaching, and Cograder are marketed to burned-out teachers to save time by giving feedback on students’ papers, designing lessons, and even interacting with students in the classroom.
We are also encouraged to recommend AI programs as tools to our students for such work as brainstorming, drafting, and making outlines. Of course, those aren’t the only ways they’re using them.
The more teachers are urged to use AI to do our jobs and to teach the use of AI to students, the wider the rift will grow between us. That’s because our job as educators is to foster and assess authentic learning.
OUR SUGGESTED FIXES
Teachers, stop assigning unsupervised take-home work. Bring back blue books and pencils. Schools, reexamine your contracts with AI companies. Do their goals promote the best interests of students? And ban unrestricted access to AI in the classroom. This is the only way to integrate new educational technologies without compromising learning. It will also help restore the student-teacher relationship.
Students and educators are asking for help. It’s not too late to make this coming school year one in which we can learn and grow together.