AI in the Writing Classroom: Professor, Beware

By Michael Dowding

https://www.nationalreview.com/2024/08/ai-in-the-writing-classroom-professor-beware/

Generative artificial intelligence has no role in the walled garden of teaching a student how to write.

For nearly three decades, I’ve had the privilege of teaching media writing to generations of undergraduate and graduate students at Boston University’s College of Communication, helping them week after week to steadily acquire the skills they need to embark on successful careers. However, two years after the arrival and wider use of generative AI, it’s inarguably clear that, at every level of education, these tools represent nothing short of an existential threat to the writing classroom — undercutting the very way we formulate, develop, and express our intelligence.

Although I needn’t rehearse all of the many statistics describing the obvious decline of writing and literacy in our society, the temptation is too great. Last fall, Joseph Pisani grimly noted in the Wall Street Journal, “the average score on the ACT dropped to a new 30-year low, indicating fewer high-school seniors are ready for college.” That lamentable decline aligns with other research showing the number of teenagers who read for pleasure dropping steadily over the past 40 years, while the number of teens who rarely or never read for fun climbed from 8 to 29 percent.

Since writing instructors can confirm that the best way to learn writing is to read voraciously, it’s clear that we instructors already face a steep climb, competing with distractions ranging from TikTok and Snapchat to Netflix and the latest smartphone games. We already have a generation that refuses or is unable to read. Do we now want them unable to write, as well?

Ask virtually any college writing instructor today and you’ll hear some variation of the same complaints about student performance: an inexplicable unwillingness to read (either compulsorily or voluntarily), an inability to properly compose sentences that rise above the most pedestrian of structures (at best), or a crippling overreliance on writing and grammar tools whose scope only increases. Those concerns are now exponentially magnified.

With the arrival of gen AI in late 2022, my colleagues and I have been facing a game-changing inflection point: the increasing inability to accurately assess an individual student’s writing talent — their ability to create, develop, and express original thoughts in clear, compelling, audience-centric ways. Even in the face of clear policies prohibiting the use of gen AI in our classes, we now face an onslaught of prefab essays brazenly churned out in seconds by time-pressed, shortcut-seeking students eager to get their tickets punched before moving down the path to their degrees.

One factor that’s also shaping this dynamic: the rising cost of education. With BU’s list price for four years of tuition, room, board, and fees topping $360,000, there are significant pressures on students to pursue part-time work, internships, and other time-consuming avenues to maximize their returns on that enormous investment. Gen AI is one highly tempting way to become more time-efficient.

My colleagues and I have been grappling — largely unsuccessfully — with every aspect of this threat: prevention, detection, and remediation. Countless times in recent months, fellow instructors have pulled me aside to furtively tell me they’re applying their own version of Justice Potter Stewart’s famous maxim as they read assignment after assignment: “I may not be able to prove it, but I know it when I see it.”

I myself have developed the “moreover rule.” Any student submission that breezily drops in “moreover,” a word that one just doesn’t see with any frequency in the Gen Z lexicon, immediately merits much more careful scrutiny. The problem, of course, is conclusively proving the use of gen AI. After all, no one wants to make unprovable (or false) accusations. So we instructors march on, nurturing ever-growing skepticism about the provenance and authenticity of what we’re reading and marking.

What’s the harm?

To be completely clear: I yield to no one in willingly and emphatically stipulating that, if it follows the expected adoption curve (and if important barriers such as its seemingly insatiable power consumption are successfully addressed), properly applied generative AI will be a massively positive and transformative force for good, both in academia and in industry, rivaling any invention of the past 100 years. With more training data and a declining dollar-per-inference cost structure, hallucinations will be a mere footnote, and the “uncanny valley” of writing will fade. Already, OpenAI has begun to address the latter issue with a special “AI Humanizer” that will take standard ChatGPT output and apply a more realistic, humanlike voice and tone. It’s clear that the quality of the tools, their output, and their applicability are headed in only one direction.

In that context, many of my well-meaning colleagues in other disciplines at BU have adopted a much more pragmatic approach to generative AI. Many are researchers who teach psychology, marketing, physics, law, and medicine. “What’s the harm? Everyone will be using it,” they say, perhaps understandably relieved that the quality of the term papers they assign and read will be much improved.

However, this “I welcome our new overlords” philosophy overlooks the fundamental question of education: Are we focusing exclusively on the output itself or on teaching the process of reliably and consistently — and independently — creating that output? In writing classes such as the ones I lead at BU, the answer is blindingly obvious. We want to (and must) teach students the process of writing, which, in reality, is the process of thinking (and then crisply and memorably articulating those thoughts). We must simultaneously teach all of the dimensions of writing: content, structure, mechanics, audience, and voice. When the student outsources writing to a machine, he or she is also outsourcing the thinking behind the writing. The student is focusing solely on the output, whose quality and merit they never become equipped to evaluate.

What have we learned two years into this new paradigm? We’ve learned that it’s more essential than ever to recognize — and distressingly easy to overlook — that generative AI does not teach students how to write. It teaches students to avoid writing.

Consider the (admittedly imperfect) analogy of the calculator. In the 50 years since Texas Instruments launched the first mass-market four-function calculator, we’ve seen a broad and undeniable decline in numeracy across our society. Arithmetic skills have steadily eroded to the point where dinner checks are presented (or, increasingly, we get the dreaded tablet flip) that “helpfully” include a range of precalculated tips to save us all from the burdens of simple multiplication.

No one would reasonably contend that giving a second-grader a calculator and showing her how to press a few buttons means that she has learned long division. Similarly, we can’t teach any student how to write simply by directing her to start a ChatGPT session. What skills are developed? What original thoughts can possibly emerge?

A return to pen and paper — or the typewriter — and live writing exercises

But what’s a beleaguered instructor to do when facing 20 novice writers just starting their college careers? I’m increasingly convinced the future lies in the past. The reality is, when we’re teaching writing fundamentals to students, we must eliminate the use of computers and their attendant temptations. I, for one, plan to conduct more in-class writing workshops — using pen and paper — so that I can more effectively engage students’ abilities, confident that I’m looking at work that is solely their own. I even envision a “writing lab” with typewriters where students devote time to cranking out hard copy like it’s 1979.

I’m a major proponent of the use of generative AI. I’ve seen its power and potential, which, in the right contexts, far outweigh the perils. But in the walled garden of teaching a student how to write effectively, generative AI has absolutely no role and can lead to a stunted generation unable to articulate the thoughts in their heads.