How Is A.I. Used in Wesleyan’s Classrooms? A Look Into Those Navigating the Shift
Consider the range of responses that could be evoked by asking a simple question: “Do you use Chat?”
The discourse on artificial intelligence (AI) has become increasingly prevalent at many higher education institutions, including the University. As tools such as ChatGPT, Gemini, and Claude become more accessible, education is among the fields most affected.
Here in Middletown, amid the confusing policies and ethical dilemmas surrounding AI’s impact, many students, faculty, and administrators are navigating how Wesleyan, as a liberal arts institution, can move forward in an era increasingly defined by AI.
In an interview with The Argus, University President Michael Roth ’78 affirmed the University’s obligation to confront the AI question.
“There are at least some people who think that we can run a university without using large language models,” Roth said. “I think they’re mistaken. I think that large language models are already ubiquitous among students. It’s just a question of how people use them, not whether they use them.”
The University’s stance on students’ use of AI remains unsettled. While the Honor Code states that the use of AI-generated “words, ideas, images, data, or research” without attribution qualifies as plagiarism, the University acknowledges that its official Generative AI Usage Policy functions as a guideline rather than a finalized protocol.
“This policy is not exhaustive, but represents work towards a thorough and comprehensive policy,” the University’s Information Technology Services website reads.
So, how should the University incorporate AI into its curricula? Or, should the University forbid the technology altogether?
Conversations about AI are commonplace on campus, and the administration is attempting to get a sample of the diverse opinions within the community. Throughout the year, the Office of Academic Affairs has organized strategic conversations about the future of the University’s curricula in preparation for Wesleyan’s bicentennial in 2031. Dean of the Arts and Humanities, Deputy Provost, and Professor of Music Roger Mathew Grant spoke with The Argus about the role of these dialogues in shaping the future of the University’s curricula.
“We’re talking to students, faculty, staff, alumni, getting people together in different groups to ask about their experience of the curriculum and about their experience of various aspects of what Academic Affairs does for the University,” Grant said.
The last time Academic Affairs employed this strategy to inform its approach to a revolutionary technology was in 1997, in response to the arrival of the internet. According to Grant, AI has taken precedence in recent conversations.
Although Grant has heard a range of opinions on AI, he notes that these conversations are productive overall. Compared to larger universities, Wesleyan’s smaller size offers more opportunities for students and faculty to convene across disciplines.
“I see a lot of consensus among faculty and students,” Grant said. “What’s interesting is that we don’t have very many opportunities for faculty and students to come together in a kind of unstructured way outside of the classroom, which is highly structured, hierarchical, and evaluative…. I think if we had more of those [unstructured] spaces, we would start to discover how much consensus there is, and I think that’s really cool.”
At the University, instructors may choose whether they want AI incorporated in coursework, and students are expected to consult their course syllabus for specific guidelines. Assistant Professor of the Practice in Video and Audio Production Pedro Bermudez has chosen to experiment with AI in the classroom.
“In the classes I teach, we use AI to improve planning and production workflows,” Bermudez wrote in an email to The Argus. “We’re also using AI in post-production alongside conventional techniques for visual effects, audio production, and sound design.”
According to Bermudez, AI platforms have been useful for digital animation and world-building in his courses, where tedious editing processes can now be sped up and made more efficient. Bermudez is also mindful of students’ opinions on AI implementation.
“I now ask the students at the start of my classes every semester to share their feelings on working with AI,” Bermudez wrote. “I think many students have a healthy skepticism and questions about the learning outcomes of projects that have generative AI components. And this ambivalence is complicated by their genuine interest in learning more about the tools.”
Looking to the future, Bermudez acknowledges that the advancement of AI will continue to impact higher education, underscoring the need to address AI’s pitfalls and to work together on solutions.
“As A.I. continues to rapidly develop, we need to integrate the ethical use of these tools into our curriculum,” Bermudez wrote. “I think this effort can involve working groups of faculty and students tasked with drafting best practices and policies to ensure transparency regarding the AI tools: specifically, how they are used, the information repositories they access, the differences between various AI models, their potential for error, and the development of workflows that prioritize and preserve the students’ unique creative and intellectual contributions.”
Jane Kakalec ’26 is one of many students whose professors permit some forms of AI in their assignments.
“In the data analysis minor and Econ major, AI is very present and not frowned upon as it is in other areas,” Kakalec wrote in a message to The Argus. “In many of my courses, AI is allowed for research planning, code correction, and idea generation.”
In Kakalec’s opinion, the lack of a general alignment concerning AI across academic areas confuses students about what is expected of them.
“I do not think students and faculty are aligned at all,” Kakalec wrote. “I think different subject areas produce vastly different views on AI. In the humanities, people seem super anti-AI and shocked when I say I use it for classes. The more quantitative fields tend to recognize AI as a useful tool, and professors are okay with it.”
In addition to quantitative fields, some humanities courses offered at the University utilize AI.
Last fall, Cullen McCleary ’29 took Roth’s “Virtue and Vice in History, Literature, and Philosophy” (COL228), where AI use was encouraged and assigned.
“Every week, our homework was to discuss both the reading that week as well as the virtue that we practiced that week with an AI chatbot, which was created specifically for our class,” McCleary wrote in a message to The Argus. “We would set a timer for 10 minutes and either tell the chatbot what virtue we practiced or pick a passage/extract from the reading to discuss. It would then respond with questions and guide you through a conversation about the topic.”
McCleary added that he enjoyed his conversations with the AI chatbot.
“I thought having a conversation with something about our reading/virtue was more stimulating and engaging than just providing a written response, and also was better at challenging your ideas,” McCleary wrote.
While the use of AI may be encouraged in some courses, a strong sentiment still exists against the inclusion of AI at the University. Assistant Professor of Art History Joseph Ackley stands firm in his commitment against AI in his classes.
“For my classes, I prohibit the use of AI, including Grammarly,” Ackley wrote in an email to The Argus. “For writing papers, it is essential that students grapple with the blank page, that they work through the mess of ideas running through their minds, and that they practice patience and focus.”
Ackley mentioned that when he asked for students’ thoughts on his no-AI policies, he received overwhelming support.
“At its worst, AI constitutes an existential threat to learning,” Ackley wrote. “Learning loss and cognitive offloading—in other words, not building one’s intellectual muscle because AI is doing the work for you—are present and acute dangers. We have already seen what the smartphone era has done to our attention spans and decreased ability to focus. I worry that an uncritical embrace of AI technologies will simply exacerbate these alarming trends.”
Despite these concerns, Ackley remains hopeful that the University and higher education as a whole will navigate AI use responsibly.
“What makes me hopeful about higher education meeting this moment is what I have witnessed at Wesleyan,” Ackley wrote. “Everyone, from the senior administration to the most junior faculty and staff, has been discussing AI nonstop. We are actively learning, and we are actively talking to each other, and one of the silver linings of the AI wave is that it has forced all of us to think deeply about what we do in the classroom and why it is valuable.”
Assistant Director of Academic Writing and Associate Professor of the Practice in Academic Writing Lauren Silber is another faculty member addressing the dilemma of AI in the classroom. From Silber’s perspective, there is no clear student perspective or alignment surrounding AI.
In collaboration with the Wesleyan Student Assembly (WSA) and the Writers Room, Silber is gathering students’ opinions on AI through an anonymous survey.
“I don’t think we have any sense of perspectives on AI at Wesleyan, let alone any sense of alignment,” Silber wrote in an email to The Argus. “That’s why students in the Writers Room Lab and I are working with the WSA to collect data on AI use, perceptions, and culture at Wesleyan. There is so much conversation about AI, but we still have no idea what’s actually happening on our campus.”
Silber also looks to engage students with AI models through her first-year seminar.
“This semester, I’m teaching a first-year seminar called ‘Writing in the Age of GenAI [WRCT155F],’” Silber wrote. “In this course, students focus on figuring out what writing is by exploring their literacy practices and journeys before tackling generative AI. We spend time discussing how machines produce text and compare machine processes to the ways that humans produce text.”
To Silber, AI is a force that will ultimately transform the learning experience for students in higher education. She maintains that adjusted teaching practices may be necessary.
“Now that there are machines that can write A- papers and decipher a complex 200-page reading in a matter of minutes, faculty must be thoughtful and explicit about how we design and sequence our courses,” Silber wrote. “Figuring out what the goals of a Wesleyan education are—or even the goals of a history major versus a comp sci major versus a theater major—will allow us to make discipline-specific, field-based, expertise-driven decisions about generative AI that will make sure that students get the most up-to-date educational experience they can.”
Alexandra Simon ’27 recently conducted research on various universities’ engagement with AI as part of a project for the College of Design and Engineering Studies (CoDES).
According to Simon, the University wants to hire a new faculty member to teach AI, and the CoDES Advisory Board believes the new position should fall within its department. Ultimately, she found that Wesleyan may be falling behind other higher education institutions in incorporating AI.
“What I’ve found is that many [institutions] have endorsed AI and are introducing new fields and courses available to students to utilize AI as an essential tool, and actually offer post-doctorate positions for AI research and development,” Simon wrote in a message to The Argus. “I looked into other similar schools (NESCACs, ivies, small liberal arts) and researched what programs they have, types of classes/research positions they offer, if they have an AI center/hub, or grants for AI research. Compared to most schools, Wesleyan is actually pretty low on their engagement with AI and the resources available for students/faculty.”
Simon is wary of students’ use of AI, but ultimately believes addressing the presence of AI is necessary.
“Ignoring AI is not the answer,” Simon wrote. “It has already become an integral part of everyday life in most corporate environments, and teaching students how to utilize these platforms is generally helpful, but I feel that the explosion of AI has left higher education scrambling to figure out what to do and where to enforce guidelines.”
Maggie Smith can be reached at mssmith@wesleyan.edu.
