Beware the AI Shortcut

Portland Magazine

April 25, 2024

Advice from a national best-selling author on Artificial Intelligence in education and literature.

Story by Jess Walter

[Illustration: a square grey building resembling a microchip sits at the center of a green space, with various paths entering and exiting. The students entering the building all look different, while those leaving all look the same.]

Illustrations by Violet Reed

I get a lot of letters and emails from readers, but this is my favorite kind:

Dear Mr. Walter,

My English 101 class has been assigned to read your story Mr. Voice and I was just curious: What would you say is the theme of that story? Also, the protagonist’s motivation? And could you identify any new vocabulary that you encounter?

I always answer such “help-me-with-my-homework” emails. To me, these students are only half-slacking. And if my answering their email helps them comprehend the story, what more can I ask? Plus, it took real work for that student to track me down and write that note.

I helped one student with her paper on my book, We Live in Water, and she was thrilled to get a B. I was not. I almost called the professor. (Look, I know Caitlin sometimes takes harmless shortcuts, like contacting the writer for help, and God knows I’ve done B-work in my time, but if there’s one subject on which Caitlin and I deserve an A, it’s the work of … well … me.)

I tell you all of this because I’ve been thinking recently about Artificial Intelligence, its uses and abuses in academia and in the literary world, and the difference between seeking help and seeking a shortcut.

This fall, I found out that half of my books had been pirated—used without my permission—to help train new Artificial Intelligence systems like ChatGPT. In all, something like 183,000 books were fed into the “Books3” dataset, five of them mine.

This was the fuel, taken without the owners’ knowledge, used to create perhaps the most valuable technology since the advent of the Internet.

Of course, I felt the same outrage that thousands of other authors have felt when I found out about my books. Using them to “train” self-learning computers, which can then reproduce the intellectual property of those same unwitting writers, is outright theft, a clear violation of copyright, and a slap in the face of human creativity.

I also share the fear that self-learning machines are an existential threat to humanity. These fears may seem a little nebulous right now, but AI is already creating vicious malware, finding new ways to violate personal privacy, and spreading deepfake misinformation. And technology experts theorize that weapons development and automated warfare might be the most lucrative applications of AI. In the hands of terrorists or rogue state actors, what could go wrong?

But even as I signed an open letter and joined the Authors Guild lawsuit protesting the theft of those 183,000 books and calling for further regulation of AI, I have to say, I had mixed feelings about my artistic objections.

My first thought was: only half my books? What was wrong with the other five? Did the computer start them and think: Eh. Kinda boring. Derivative. Not my cup of data.

I had mixed feelings for another reason, too.

Literature is a kind of database, the full measure of human wisdom and knowledge, added to by each successive generation of writers. My work doesn’t exist without the books that came before it and their influence on the way I think, the way I write, the world I see.

But when human beings create something new using those vast stores of wisdom and creativity, they bring themselves into the equation, their inimitable feelings, thoughts, and experiences. Their selves.

AI has no self. It is merely the sum of its stolen parts, approximating millions of human interactions input into its memory. AI can create remarkable imitations of human creativity (and some, not so remarkable) but, as the great Elvis Costello puts it: “Imitation is the sincerest form of plagiarism.”

On campuses, a big issue, of course, is students using AI programs to write their papers. On this, I have no mixed feelings. I want to be clear.

Do not do this! It is unethical and lazy. And the worst part is: the only person you’re cheating is yourself.

ChatGPT can probably write you a decent C+ paper on Toni Morrison’s Beloved, but you will experience none of the horror and humanity of that haunting novel.

You may ask, what’s wrong with taking a few shortcuts? Well, the automotive industry recently had a crisis: people were dying from its shortcuts. Faulty ignition switches on GM cars shut off engines and disabled airbags, and 124 people died; Volkswagen engineers outright lied and falsified emissions tests, and experts feared that as many as 1,200 more people might eventually die from increased exposure to exhaust fumes.

Studies found that engineers had been under so much pressure to get good grades and get accepted into the best engineering programs that they had created a culture of cheating and shortcuts that extended into their careers, and to the culture of engineering itself, with disastrous results.

Now, luckily, your C+ Beloved paper isn’t going to kill anyone, except perhaps your poor English professor, forced to read paragraphs that start, “Moreover, taken in its totality…” over and over in computer-generated papers.

But what does get lost in the long term are qualities like intelligence, civility, and empathy.

And the person who gets cheated in this equation…is you. You paid thousands of dollars to learn what Toni Morrison had to teach you about the evils of slavery and you gave that valuable lesson away. To your garage door opener.

Imagine you’ve paid for a tour of Rome, written the check, flown to Italy, and now you sit in your hotel room looking at pictures of the Colosseum on Wikipedia. That’s what using “artificial” intelligence is like. Artificial. And not very intelligent.

Here’s a little secret: no one really cares about your college grades—not your parents, or your teachers, or your graduate school, or your future employer. It’s just a crude way to measure whether you paid attention to the education that you paid for. (Perhaps if there is a bright side to AI, it’s that it might lead to rethinking arcane, corruptible grading systems.)

As Socrates said: “Education is the kindling of a flame, not the filling of a vessel.” AI may be a big vessel, but it’s no flame.

Look, I’m not some Luddite who thinks you shouldn’t use a computer in physics, or that all your work should be shown on a slate tablet. But we all know the difference between learning tools that help you comprehend material and those that help you avoid it.

Stumped by a work of literature? I wouldn’t hesitate to read the online summary, or the CliffsNotes (then go back and read the novel or play or poem). Hell, you can even write the author, assuming they’re alive. And with any luck, they may just write you back.


JESS WALTER, the fall 2023 Schoenfeldt writer, is the author of ten books, including the best-selling novels The Cold Millions and Beautiful Ruins, and, most recently, The Angel of Rome, a collection of stories.