The Important Work is a space for writing instructors at all levels—high school, college, and beyond—to share reflections about teaching writing in the era of generative AI. We hope to spark conversations and share ideas about the challenges ahead, both through regular posts and through the discussions they generate. If you have comments, questions, or just want to start a conversation about this week’s post, please do so in the comments at the end of the post.
This week’s post is by Christopher McVey, a Master Lecturer in the Boston University Writing Program, where he teaches courses on AI Philosophy and Ethics, post-apocalyptic fiction and film, poetry of witness, and topics in literature. He is part of a team studying student practices and beliefs about generative AI, and he serves as a member of the Boston University AAC&U Institute on AI and the Curriculum. You can read more of his writing at https://medium.com/@cmcvey.
If you’re interested in sharing a reflection for The Important Work, you can find all the information here. —Jane Rosenzweig
When I began teaching writing with generative AI as part of an AI-intensive writing pilot at Boston University, I relished the surprise on my students’ faces when I explained that they could use generative AI throughout the writing process, including in their submitted work (my AI policy is here). I was unprepared, however, for one question posed by a student toward the back of the room, a smirk curling across his face as he asked it: if students could use AI to write, would I use AI to grade?
Though I knew I would not—for me, personal feedback on writing is the most important work I do—I admitted that I had never even considered the possibility. I wondered aloud what students would think if I did use AI for feedback or grading, but nobody volunteered an answer.
I interrupted the silence with a lighthearted joke: imagine, I said to them, a world in which students used AI to complete their writing assignments and faculty used it to grade those same assignments. Nobody would have to get out of bed in the morning, and we could let AI live surrogate lives for us. Everyone laughed, and that thought experiment sparked the first of many conversations with my students about how AI compels us to examine how we find meaning in the work that we do.
The pilot began as a partnership between the Boston University Writing Program and the College of General Studies. This first year included three sections of WR 152, the second course in the Boston University first-year writing sequence, each taught by a different instructor, and three sections of RH 103, which examines rhetorical practices from the ancient world through the Renaissance, all taught by the same instructor. Our goal was to experiment with intentional generative AI (GAI) integration in the writing curriculum and to identify best practices that could inform program guidelines and pedagogies. We agreed to similar AI policies, allowing students to use generative AI for up to 50% of their submitted work. We also benefited from funding that covered ChatGPT Plus licenses for all participating faculty and students and provided each section with an AI affiliate—an undergraduate with experience in using generative AI or in writing tutoring. Affiliates usually attended one class each week, helping instructors understand how best to integrate GAI into assignments and class sessions, and supporting students in using GAI for their writing. The pilot continues this year with a new roster of instructors, though I am still teaching AI-intensive writing classes.
My colleagues and I have learned a lot from the pilot—more than could be covered in a single post. Lately, however, this teaching has me thinking about efficiency and the increased productivity that AI promises us all. For my students, generative AI helped with brainstorming ideas, developing research questions, finding sources, and even doing a bit of the writing. But it hasn’t always made the process more efficient; collaborative writing with AI is neither easy nor straightforward, and doing it effectively is a skill that takes practice.
Moreover, most students chose to use their own voice when writing because they felt proud of their work and did not want AI to do it for them—at least, not all of it. Perhaps this should not have surprised me. Students want their classes to be meaningful to them, and letting AI do too much of the writing, they felt, was a kind of impoverishment.
I am grateful for the wisdom they displayed in those moments. The context of our classes—the grades, the credits, and the unfathomable cost of higher education—disciplines and punishes students into being as efficient as possible in making progress toward a credentialed degree. Who can blame the students who regard their education as a transactional exchange in which—to reference Margaret Atwood’s dystopian novel Oryx and Crake—they see themselves as customers and faculty as service providers? In that novel, the protagonist attends the fictional Martha Graham Academy, a dilapidated liberal arts school without an endowment and desperate for tuition dollars to keep the lights on. As a result, the university decides to rebrand itself. Its original motto is taken from Hippocrates: ars longa, vita brevis—“art is long, life is short”—but the school changes it to “Our students graduate with employable skills.” Writing classes are replaced with “Applied Rhetoric,” a euphemism for sophistry and marketing. Yet who shall cast the first stone? Who among us is not guilty of justifying the importance of a writing class by underscoring how much employers value the critical thinking and communication skills our classes emphasize? As a profession, we have opined endlessly about students using AI to avoid learning or take shortcuts, but we have not sufficiently examined the ways the modern university incentivizes them to make exactly those choices.
How do we make space for inefficiency, and can writing classes—including writing with AI—serve as an opportunity to value that which does not come easily or quickly? Just because AI tools are marketed in terms of efficiency doesn’t mean we have to use them that way. There were many moments in my class when exchanges with ChatGPT helped students reconsider their position on an issue or prodded them to pursue a research question they had not initially considered. Students tested their theses by recruiting AI as a skeptical interlocutor, compared scholarly sources identified by AI against those found through traditional searches of the library database, and practiced revising by using AI to identify their most confusing sentences (some additional AI activities are included in these AI Mini-Games, created by my AI affiliate, Neeza Singh). Rather than speeding things up, AI gave us new ways to slow things down and try different approaches.
I have also been surprised by how much space in the syllabus is needed to make room for working with generative AI. Learning how to prompt AI effectively and leverage it as a collaborator takes experimentation and room for failure. I often found myself scrapping entire lessons so we could spend more time exploring AI’s capabilities and talking together about our experiences with these tools, both within and beyond the bounds of our class. Students are desperate for these conversations, but they are understandably hesitant to speak openly lest they be judged by their professors or their peers.
Generative AI offers us an opportunity to think about how we spend the time we have. It pushes us to consider whether we should find more ways to value work that is inefficient or does not come easily. We should not assume that students will figure out the best ways to work with AI on their own, and we must model for them what it means to be a lifelong learner by remaining open-minded ourselves. The weight of the semester and the timetable of the academic calendar are often at odds with this kind of experimentation.
Finding ways to value inefficiency is also, I admit, exhausting. After a semester of learning to write with AI, we were all ready for a break. On the last day of class, I asked students what they most looked forward to doing in the summer. One young woman looked up from her desk and smiled: “Finally having time to read a book.”