
Crisp: English teacher faces ChatGPT

By John Crisp
Published: December 20, 2022, 6:01am

Several weeks ago, the artificial intelligence company OpenAI released ChatGPT, language-model software that aspires to the Holy Grail of interaction between humans and their computers: the ability to have a “conversation.”

Henceforth I’ll stop putting quotation marks around words like conversation and think and remember. These are things that humans do, and we should keep in mind that we’re still talking about a machine. But for this column I’ll dispense with the judgment those marks imply when a word such as “learn” is applied to a computer. If you experiment a bit with ChatGPT, you might see why.

Because this software has been trained — by humans — to recognize the probabilistic connections between words as humans use them. I’m not sure what that means, but the result is that ChatGPT appears to engage in conversations not unlike those between people.
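For readers who want a concrete picture of what “probabilistic connections between words” might mean, here is a minimal sketch of my own devising, not OpenAI’s actual method: a toy Python program that counts which word tends to follow which in a scrap of text, then chains the likeliest successors into new text.

    # A toy illustration, not OpenAI's actual method: learn which word
    # tends to follow which, then chain the likeliest successors.
    from collections import Counter, defaultdict

    text = "the cat sat on the mat and the cat slept on the mat"
    words = text.split()

    # Count how often each word follows each other word.
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

    # Starting from "the", emit the most probable next word five times.
    word = "the"
    for _ in range(5):
        word = follows[word].most_common(1)[0][0]
        print(word, end=" ")   # prints: cat sat on the cat

ChatGPT operates on the same underlying idea, predicting likely next words, but across billions of such connections rather than a dozen.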

I approached ChatGPT with skepticism born of more than three decades of teaching writing skills to college freshmen. I’m from a generation for whom artificial intelligence is the stuff of science fiction and to whom writing is a semi-mysterious skill or art reserved for human beings.

Still, I tried to retain an open mind. I started by asking ChatGPT to perform a task familiar to many students: “Write a college admissions essay about my time in the Peace Corps in Bolivia.”

In 30 seconds ChatGPT produced an organized, credible, grammatically correct essay about my imaginary work as a community health volunteer in a rural village in Bolivia. I conducted health workshops, helped establish a clean water system and worked with local clinics to improve access to health care.

It was a “truly enriching experience” that “prepared me for a career in public service.” I was “excited to bring my skills and experiences to (University Name) and to contribute to the university community.”

I had good experiences elsewhere, as well. I was “welcomed with open arms” by the needy citizens of Costa Rica, Ghana, Jordan and Mexico. I helped build schools, taught English, coached children in computer skills and organized physical-education classes.

But all of this sounded too good to be true. I asked ChatGPT to include some information about negative experiences in the Peace Corps.

ChatGPT seemed to understand the need for transparency, but it wisely pointed out that in an admissions essay it’s important to cast my experiences in a “positive light.” I could mention — or ChatGPT could do it for me — a negative experience such as homesickness or difficulty adjusting to a new environment. The admissions committee, ChatGPT said, would be interested in how I overcame it.

This is reasonable advice, but my skepticism persisted. When I asked ChatGPT to write an essay about my service in the Peace Corps in North Korea, it seemed to know I was messing with it. The Peace Corps does not have a program in North Korea, it sniffed, and thus it would be impossible for me to have served there. Furthermore, “It is not appropriate to fabricate and exaggerate your experiences.”

Busted. Duly chastised, I began to give ChatGPT a little more respect. I asked: “Write a 675-word newspaper op-ed on how ChatGPT could be used to teach college writing.” In 30 seconds, ChatGPT did that very thing.

But not the op-ed you’re reading here. ChatGPT’s prose is bland and formulaic. It sounds as if it were written by a machine. It’s annoyingly equivocal, filled with phrases such as “On one hand,” “On the other,” and “In general.”

Most of all, ChatGPT’s prose is … soulless. It doesn’t have that ineffable sense of voice or will or agency that only a real human being can render in prose. At least so far.

One thing is clear: For good or ill, something monumental happened to writing instruction in December of 2022; it’s unlikely to ever be the same.

But can college students use ChatGPT to cheat in college writing classes? Just ask it.
