The Important Work is a space for writing instructors at all levels—high school, college, and beyond—to share reflections about teaching writing in the era of generative AI.
This week’s post is by Jenny Lederer and Jennifer Trainor. Jenny Lederer is Professor of Linguistics and Linguistics Program Coordinator at San Francisco State University. Jennifer Trainor is Professor of English and Faculty Director at SF State's Center for Equity and Excellence in Teaching and Learning, and is the author of "Seeking Literacy Justice" (with John Holland, forthcoming in Composition Studies) and "The Big Picture" Substack.
If you’re interested in sharing a reflection for The Important Work, you can find all the information here. —Jane Rosenzweig
This past spring, the California State University (CSU) Chancellor’s Office invited faculty to attend an OpenAI webinar titled Writing in the Age of AI: What Faculty Need to Know. The webinar was offered as part of the landmark OpenAI-CSU partnership—a $17 million effort to become an “AI-powered university.”
As writing instructors at San Francisco State University, we viewed the webinar with curiosity and dread. The webinar presenter opened by reviewing scholarship on the connections between writing and thinking, and stressed the educational value of teaching students to write. So far, so good. But then an audience member submitted a question: how can we ensure that students engage in critical thinking via writing, given easy access to ChatGPT?
The presenter struggled for an answer before settling on the increasingly familiar advice to move writing into the classroom and bring back blue books.
Before we dive into the pros and cons of this advice, we want to pause for a moment on the irony: learning to write, from OpenAI’s point of view, requires avoiding the very technology they are selling. Indeed, this admission on their part echoes what many faculty already feel: we can no longer assign longer-form take-home writing projects; we must water down the curriculum, perhaps even returning to outmoded in-class assessments.
No wonder CSU faculty senates at CSUN and Fresno have issued resolutions condemning the CSU-OpenAI partnership. No wonder many in the field of Writing Studies are refusing AI altogether.
Experts in Writing Studies argue that blue book assessments exacerbate the AI-cheating panic. We agree. But we can also understand why faculty are worried. Student use of GenAI is high and climbing. In our department, nearly all writing teachers have banned GenAI; many are buying notebooks for their students and planning to flip the classroom—moving key reading and writing activities into class time.
Interestingly, these efforts are not just about protecting students’ literacy learning or preventing cheating. They are also a strategy to resist an increasingly tech-mediated, disengaged, checkbox approach to education embraced by cost-conscious administrators and by students who juggle jobs and commutes that make attendance and engagement difficult.

This checkbox approach brings us to one of the key selling points offered to participants in the OpenAI webinar: the personalization of GenAI. Throughout the presentation, the presenter leaned on personified discourse to describe student interaction with ChatGPT: use it as a tutor, a helper to enhance creativity, a sounding board to bounce ideas off. (Mind you, none of his case studies focused on college students; they featured well-seasoned writing professionals.) Embedded in this “agent” discourse is the implication that AI tools can replace the functions of well-established institutional support systems like on-campus tutoring centers, disability resource centers, professor office hours, and even the kind of support students would typically get from classmates, roommates, and friend groups. In general, OpenAI frames this as a benefit; in the webinar, the presenter claimed ChatGPT could replace some tutoring services and in-person support from faculty, allowing students to receive real-time help without waiting or feeling intimidated.
The more universities encourage students to replace on-campus human connection with AI “agents,” the less likely students are to build the social and professional networks of their futures. Social interaction in the classroom, in the library, with a peer tutor, or with the student down the hall is, arguably, the essential ingredient of a successful college experience.
Worryingly, OpenAI’s personalization message has partially landed with students. Students have told us they routinely upload assignments into ChatGPT to figure out what an assignment is asking, assuming that teachers won’t have time to answer their emails (though we, of course, do). Others have trained the bot on their own writing, asking it to replicate their voice—and even their common grammatical mistakes—so the output is more believable, more “them.”
While we disagree with OpenAI’s call for a return to blue books as a way to police cheating, we are still considering moving writing back to the classroom: not to surveil students, but to prevent the very depersonalization, and corresponding isolation, that OpenAI celebrates with these tools.
On our campus, serving first-generation and working-class students, we’ve seen a dramatic shift, accelerated during Covid, toward online instruction and LMS-heavy instructional “delivery” models. We are encouraged to make our classes available in multiple modalities to accommodate students whose work schedules, mental and physical health struggles, and learning differences make attendance difficult. These efforts to expand access—which we support unequivocally—have too often taken the easy (high-tech) road, leading unintentionally to decreased attendance and a breakdown in classroom and campus community. Technology solutions to structural problems—where messy, unpredictable, difficult classroom learning communities are reduced to clicking checkboxes or submitting files on Canvas as more and more classes are offered asynchronously—are negatively reshaping students’ expectations and experiences. As Wardle notes, structural issues like these create fertile ground for GenAI tools to masquerade as solutions.
Given these issues, we propose a third way that addresses the isolation that “personalization” brings and avoids the punitive policing that blue books imply. This fall, we plan to bring some writing back into the classroom not to catch cheaters, but to rebuild learning communities, emphasize the importance of writing to learning and thinking, and resist personalization.
While instructors’ return to in-class, pen-and-paper writing is often understood as a form of AI policing, we argue that it elevates and makes learning explicit, providing space for students to lean into process and de-emphasize product. We plan to purchase class sets of inexpensive notebooks for low-stakes writing: informal notes, reading responses, prewriting, reflection, and yes, drafting. Bringing writing into the classroom does not preclude teaching students about GenAI, nor does it mean that teachers can’t teach students how to work with AI. We understand that GenAI tools may, in some cases, provide scaffolding that supports students’ voice and rhetorical agency. In our classes, we plan to introduce students to the limits of GenAI, especially when it comes to linguistic justice and bias, and to invite students to do some investigative work of their own around the potential and challenges GenAI poses for their development as writers. But we also see many benefits for in-class writing free from GenAI influence:
Ensures participation: Writing before discussions can give students a foundation of ideas to bring to the conversation; writing to summarize or synthesize discussions can help students generate talking points for their essays;
Builds critical thinking: Through freewriting, list-making, note-taking, dialogic journaling, and perspective-taking exercises, students engage more deeply with ideas;
Supports reading comprehension: In-class notes on readings help students learn how to annotate, ask questions, and track ideas;
Promotes the writing process: Notebooks allow for ideas-in-process where students’ thinking can evolve over time;
Counters over-reliance on AI: Since students generate content during class, they already have a body of work to draw from when writing at home—reducing writer’s block and the temptation to rely on AI.
Some instructors in our program have wondered whether in-class writing is worth the trade-off, since it takes time away from other kinds of writing instruction and activities. But decades of writing process scholarship affirm that freewriting, note-taking, and journaling can accumulate into a more reflective writing practice.
We also plan to use in-class writing for higher-stakes assignments. At the end of each unit, students collaboratively generate a list of key terms and concepts related to the Writing About Writing (WAW) concepts we’ve covered (e.g., voice and identity; the rhetorical situation; genre awareness; linguistic justice). Working in groups, they create reflective, open-ended questions using these terms (e.g., How did your understanding of audience change in this unit? Do you think writers always have to accommodate their audience, or should audiences learn to listen to them? Does freewriting help your final draft, and if so, how?).
During the next few class periods, students choose one piece of their own writing to analyze, respond to the questions, brainstorm answers together, and peer review their work. This kind of in-class assessment has many benefits in general (it fosters metacognitive awareness; it centers student agency; it avoids reifying school genres and white language supremacy). Undertaken collaboratively in class, it adds the following benefits:
Builds community: Because students write about shared classroom content, and often share their writing in small groups, in-class writing contributes to a culture of reflection and shared growth. It can foster a sorely needed sense of belonging and engagement. By engaging students in community-based, in-class assessments, we signal to students: your writing matters; you’re engaged in meaningful learning that is worth the effort and sacrifice that college requires; your community is here to help.
Centers process over product: Moving writing into the classroom ensures that students have the time and space to engage in the writing process. Indeed, we believe that one-shot writing happens as often outside of class (when students squeeze writing assignments in before the due date, juggling a myriad of other responsibilities) as it does with traditional blue books. Our approach scaffolds writing time across class sessions to help students practice the steps of the writing process in community and with support.
OpenAI’s proposal that blue books might save student learning, along with its push for personalization, is a telling admission: the company itself recognizes the potentially corrosive effects of its tools on literacy even as it pushes its solutions on resource-strapped institutions. For us, reclaiming the classroom and in-class writing is not meant to resurrect outdated assessments, nor necessarily to ban GenAI, but to push against the isolation that has crept onto our campus as budgets shrink and tech solutions beckon.
Dynamic classrooms won’t remedy all the learning obstacles our students face, but we believe that moving writing back into the classroom—in both informal, low-stakes and more formal, higher-stakes opportunities to demonstrate and reflect on learning—can signal to our students the importance of an embodied, social learning experience, and can emphasize to them that we value their presence and voice.
OpenAI’s inability to articulate how ChatGPT can be used to enhance the learning of writing during their own webinar is a clear sign that they are what they have always been—a tech disruptor pushing a product that weakens teacher and peer relationships. Their own webinar reinforced to us that their attempts to enter the educational space are more PR than sound pedagogy. How can a product that replaces thinking simultaneously serve as a tool to learn how to write? Writing instructors know that writing is thinking and thinking is writing. On some level, the webinar revealed, OpenAI knows this too.