The Important Work is a space for writing instructors at all levels—high school, college, and beyond—to share reflections about teaching writing in the era of generative AI.
This week’s post is by Aleksandra Kasztalska, who works as a Senior Lecturer in the CAS Writing Program at Boston University. Her current research and pedagogical projects examine the role of AI in teaching writing and supporting multilingual learners.
If you’re interested in sharing a reflection for The Important Work, you can find all the information here. —Jane Rosenzweig
Research is hard. Having taught many college courses in research and writing, I’ve seen a lot of undergraduates struggle with narrowing down their topics, finding relevant sources, and reading scholarly articles. And as a teacher, I’ve struggled to get students excited about research, which can feel overwhelming and frustrating to them. After all, so much of what we ask our students to do requires deep familiarity with academia and the scientific process—the “hidden curriculum” that takes years of practice and active participation to understand.
This past spring semester, I had the opportunity to see how AI can support novice researchers and reduce barriers that undergraduates face in entering scholarly conversations.
I took part in the second iteration of an AI-intensive writing pilot at Boston University, which my colleague and The Important Work contributor Chris McVey discussed in an earlier post. In my classes, I asked students to explore and reflect on the role that AI plays in their writing and research. I also introduced several AI tools and, with the help of my AI Affiliate (an undergraduate teaching assistant versed in AI), created guides and activities that encouraged students to test AI’s capabilities.
We still struggled at times, but I think that AI made the research process more accessible and helped level the playing field for some of my students.
Let’s take finding sources as an example. In the past, my students were more or less limited to finding scholarly sources through our university library’s search engine, which can be challenging to use for new researchers. To find relevant sources, they had to formulate precise keywords and apply appropriate filters—a surprisingly complex task. As a result, many felt overwhelmed or needed a lot of extra help.
This semester, I introduced my students to Perplexity and Consensus as supplements to the library search. Like ChatGPT, Perplexity uses a simple conversational interface, but it’s designed for research and generates more accurate, factual information. While Perplexity offers a broad introduction to a topic, Consensus helps students understand the scholarly conversation around it. In the past, many of my students had a really hard time finding a research gap or an interesting point of disagreement between scholars. The process also took them a lot of time, which we simply don’t have in a 16-week semester. But with Consensus, students could generate a broad overview of a scholarly conversation within minutes. They could even get a “Consensus Meter” report that analyzes expert views and shows their relative popularity.
Perplexity and Consensus made academic research less overwhelming and more interesting to my students, but these tools come with real limitations and downsides. For instance, the Consensus Meter offers a very simplistic way of thinking about research conversations and can create a false sense of certainty. And Perplexity, while more reliable than ChatGPT, can produce research reports that are too broad in scope or include sources that are only tangentially relevant to the research question. As a result, a researcher who uses AI needs to not only carefully verify all AI output but also take an active role in the research process. In my classes, I emphasized the importance of the researcher’s agency and asked my students to center their research questions and their own voices at each stage of the research process. Just as we have to maintain a healthy degree of skepticism when reading and evaluating human-created sources, we have to actively question and evaluate AI-generated content.
To develop critical thinking about these tools, my students and I read about and discussed the dangers and pitfalls of AI. We concluded that, while AI can be a great tool that makes research easier and more fun, there’s also value in tackling a difficult task on your own. As we learned from Dr. Winerö’s TED Talk, the struggle is often a key step in the learning process. To encourage this productive kind of struggle, I generally asked students to first complete a task on their own, then identify a specific issue they needed help with, and finally consider how AI might be able to assist them, if at all. For example, after my students drafted the background sections for their research papers, some had a hard time identifying terms or concepts that might be unfamiliar to audiences from different backgrounds. We found that tailored ChatGPT prompts like these can help identify and address these kinds of issues.
Perhaps more controversially, I also allowed students to use AI-generated text in their papers. I had two main reasons for doing so. First, I knew that many of my students were already using AI to generate text in their classes and that they were getting little to no guidance from their other professors. I therefore felt that it was my responsibility to guide them and help them use these tools intentionally and ethically, and in ways that support their learning as much as possible. Second, I was just plain curious to see how my students would use AI. How would our class readings and discussions inform their use of AI? How would they apply the prompts and strategies in their papers? Which rhetorical moves would they delegate to AI, and where would they choose to keep their own voice? As a teacher, I wanted to understand how my class shaped my students’ decisions and what role AI actually played in their writing and research process.
To this end, I allowed my students to use AI-generated writing for up to 30% of their drafts, provided that they clearly marked any AI-generated text in blue font. This gave me a good deal of insight into my students’ use of AI. For example, I found that students rarely used AI for more than 10% of their writing or to generate entire paragraphs. Instead, most used AI strategically: to write a concluding sentence for a paragraph, to summarize a complex study, or to transition between paragraphs. Many also used AI to get feedback on their writing or to suggest ways to reduce their word count. One student, for example, would paste individual paragraphs into ChatGPT, ask for suggestions on how each paragraph could be shortened, and then evaluate those suggestions, implementing only the ones he thought were appropriate. As an instructor, I was happy to see my students use AI intentionally and critically, and glad that they felt they could share their observations with me.
In their reflections near the end of the semester, a lot of my students referred to AI as a “tutor,” “private peer review assistant,” and “collaborator.” They valued AI’s ability to help them at various stages of their research and writing process, but they were also wary of giving up too much control or agency. By learning about and experimenting with different AI tools, they seemed to have gained a good understanding of what AI can and can’t do. They also thought more deeply about the role that AI could play in their learning and research process.
Research is hard. Even in the age of AI, research can be frustrating and overwhelming to undergraduate students. But while a certain kind of struggle can be productive, students can feel demotivated when they encounter unreasonable barriers. Through critical discussion and thoughtful use of AI, I believe we can give our students a real shot at participating in scholarly conversations. We can level the playing field by reducing the gatekeeping of the hidden curriculum in research.
As someone who teaches academic writing and reading to PhD students at assorted universities here in London, I found this an invaluable post. I really hope they offer some workshops at some point to non-BU scholars and teachers interested in this area.
Love this! I am borrowing these ideas to use in my writing courses in the fall.