Left to right: Chris Grobe, associate professor of English; Kristina Reardon, director of the intensive writing program; Lee Spector, professor of computer science.

“ChatGPT doesn’t go away if we ignore it,” said Kristina Reardon, lecturer in English and director of Amherst’s Intensive Writing Program.

Indeed, the reason she was speaking in Frost Library on Feb. 20 was to help faculty, staff and students grapple with the advent of OpenAI’s language-model chatbot, which has attracted more than 100 million users since its launch in November 2022. ChatGPT can generate grammatically correct and confident-sounding passages of text on just about any topic—which means it has potentially radical implications for the future of schoolwork and written communication in general.

So the College’s Academic Technology Services and Center for Teaching and Learning (CTL) convened a panel titled “ChatGPT in Education: Boon, Bane and Beyond.” It was facilitated by Jaya Kannan, director of technology for curriculum and research, and Riley Caldwell-O’Keefe, director of the CTL.

In considering her changing role as a writing instructor, Reardon said, one of the questions she asks herself is: “What should I push my students to do that ChatGPT cannot?”


Panel facilitators: (top) Riley Caldwell-O’Keefe, director of the center for teaching and learning, and Jaya Kannan, director of technology for curriculum and research

But her fellow panelist Lee Spector noted that, as time goes on, there will be less and less that this kind of technology cannot do. Spector is the Class of 1993 Professor of Computer Science and director of the Artificial Intelligence in the Liberal Arts initiative (which hosted a similar discussion last semester about an AI visual art generator called DALL-E). Right now, he said, ChatGPT is roughly analogous to a smartphone’s auto-complete function: it can imitate “the kind of stuff that people say,” but the stuff isn’t necessarily factually accurate, and the software can’t evaluate its meaning. However, the state of the art is advancing rapidly—developers could soon hybridize this kind of program with one that can use reasoning and logic.

Associate Professor of English Christopher Grobe cautioned against applying our intuitions about human intelligence to try to understand or predict what AI programs might do. He believes it’s important to encourage students “to be curious about these technologies” and to “assess, express and transfer values” from various academic disciplines into conversations with and about the tech sector. We should remain aware that the public’s levels of access to programs like ChatGPT depend upon the business decisions and financial interests of tech companies.

Throughout their conversation, the panelists reflected on points raised in several different sources. These included Grobe’s recent Chronicle of Higher Education piece “Why I’m Not Scared of ChatGPT”—which, Grobe said, focuses on writing as a process rather than as a product through which student performance is assessed and ranked—and science fiction writer Ted Chiang’s New Yorker essay “ChatGPT Is a Blurry JPEG of the Web.”

The panel also cited Bruce Ballenger’s classic essay “The Importance of Writing Badly” and the work of 20th-century mathematician and computer scientist Alan Turing. One audience member even quoted Robert Frost’s poem “The Most of It”: “… what it wants / Is not its own love back in copy speech, / But counter-love, original response.”

The discussion attracted several students and many faculty and staff from a wide range of offices and departments—the Writing Center, Keefe Science Library, geology, English, physics, computer science. Amherst’s chief communications officer and chief information officer were in attendance, as was President Michael A. Elliott.

Comments from the audience brought up complex questions: Why is ChatGPT so much more popular and controversial than other kinds of chatbots? When people use it to help write an article, should the program be credited as a co-author? What will happen when new AI programs are trained on the writings of older AI programs? If a program homogenizes a multitude of writerly perspectives and styles into one voice, will students learn to emulate that voice instead of developing their own? Will AI render some skills unnecessary for students to learn?

In Spector’s view, the evolution of artificial intelligence “changes a lot of fundamental questions in many disciplines.” But his main goals for his students won’t change: he still wants them to experiment and to learn to think clearly and critically.

“The meaningful part of education,” he said, “is about the transformation of you and your capabilities.”