As might be obvious, I like to write. Writing enables me to think things through, to come to grips with the errors of my views and the beliefs that cannot be justified or explained. In other words, I don’t know what I think until I see what I write. (Sorry, Bill.)
At Techdirt, Mike Masnick tells a cautionary story about his child.
About a year and a half ago, I wrote about my kid’s experience with an AI checker tool that was pre-installed on a school-issued Chromebook. The assignment had been to write an essay about Kurt Vonnegut’s Harrison Bergeron—a story about a dystopian society that enforces “equality” by handicapping anyone who excels—and the AI detection tool flagged the essay as “18% AI written.” The culprit? Using the word “devoid.” When the word was swapped out for “without,” the score magically dropped to 0%.
The irony of being forced to dumb down an essay about a story warning against the forced suppression of excellence was not lost on me. Or on my kid, who spent a frustrating afternoon removing words and testing sentences one at a time, trying to figure out what invisible tripwire the algorithm had set. The lesson the kid absorbed was clear: write less creatively, use simpler vocabulary, and don’t sound too good, because sounding good is now suspicious.
At the time, I worried this was going to become a much bigger problem. That the fear of AI “cheating” would create a culture that actively punished good writing and pushed students toward mediocrity. I was hoping I’d be wrong about that.
Turns out… I was not wrong.
By using AI to vet students’ writing for AI, schools are pushing students to use AI to vet their work first so they aren’t accused of AI cheating. Even students who wanted nothing to do with AI feel compelled to use it defensively, if not offensively. And just like that, the guy who coined the Streisand Effect invokes the Cobra Effect.
This is the Cobra Effect in its purest form. The British colonial government in India offered a bounty for dead cobras to reduce the cobra population. People started breeding cobras to collect the bounty. When the government scrapped the program, the breeders released their now-worthless cobras, making the problem worse than before. AI detection tools are our cobra bounty. They were supposed to reduce AI use. Instead, they’re incentivizing it.
So rather than being incentivized to work hard and write well, students are being taught to dumb down their writing, to use words that are more common, imprecise or vague, and to settle for mediocrity, lest they be tagged as cheaters by the very AI they’re accused of using to write better.
We are teaching an entire generation of students that the goal of writing is to sound sufficiently unremarkable! Not to express an original thought, develop an argument, find your voice, or communicate with clarity and power—but to produce text bland enough that a statistical model doesn’t flag it.
The word “devoid” is too risky. Em dashes are suspicious. Confident prose is a red flag.
As Mike, and pretty much every academic, recognizes, many students do use AI to cheat. Indeed, at dinner last night, I was told by a student at Columbia about how his classmates, many of whom spoke badly broken English, were turning in lengthy essays in perfect prose. Why bother going to school if you’re not going to learn anything? The answer, of course, is to get the credential, which matters far more to them than knowledge or capability since AI is there to do the heavy lifting.
As much as lawyers may (or may not) burn their reputations or clients on the altar of hallucinated citations and quotations, there appears to be no way to eradicate AI’s involvement, whether offensive or defensive, from academia. If students aren’t using AI to do their work, they’re using AI to protect their work from being flagged as AI for using smart words. They can’t win, and people who can’t win stop trying.
The counterargument is: what’s wrong with letting AI do what it does? The short answer is that while AI might be generally better than the average human, it cannot surpass mediocrity. And with AI in education, there is a strong likelihood that the same will be true of students, no matter how smart or capable they may be. One way or another, AI will be their ceiling.