What Happens After AI Destroys Gen Ed?

In all three of the classes I’m teaching this semester (and, like last semester, I’m teaching two sections of first-year writing and Writing, Rhetoric, and AI), I am once again having students read and discuss Hua Hsu’s New Yorker essay from last summer, “What Happens After AI Destroys College Writing?” I blogged about this essay here last semester. As I wrote then, I think Hsu does a good job of capturing the anxieties that both students and teachers have about AI. Ultimately, Hsu’s answer to the question posed by the title (probably written by an editor rather than him) is that AI doesn’t “destroy” college writing, but AI is challenging a lot of assumptions about what college itself is for. Is getting a college degree simply a series of hoops to jump through to earn a degree that assures membership in the upper-middle class, or is it about learning/teaching things?

This time around, I am reading this essay a bit differently. Hsu interviews several students across the country about AI, and the one that stands out the most to my students and me is “Alex,” a pseudonym for a student at New York University, because he is so unusual. Unlike every other student I have caught cheating with AI or anything else, Alex is the unicorn of AI think pieces like this: a criminal mastermind cheater. Hsu quotes Alex as he explains how he uses ChatGPT to write papers and summarize readings for almost everything, and, if he is to be believed, Alex always gets away with it.

I don’t know, maybe Alex told Hsu the truth, and maybe there are students who have cheated like Alex in my classes that I never knew about. But as I’m re-reading and thinking about my own experiences with students talking about how they use AI or the students who try to cheat with it, I’m wondering if Hsu is taking a little “creative license.” Or perhaps Alex is a kind of composite character included because he represents the worst (and mostly exaggerated) fears about AI in education.

In contrast, most of Hsu’s interviewees describe using AI in ways that range from things everyone, including me, thinks is blatant cheating (for example, AI “writes” the entire paper and the student does a little light editing before turning it in) to using it for stuff like brainstorming, feedback, and other activities that some professors forbid and that I encourage. But within this range of use, most of these students were selective about the courses and assignments where they use AI, especially when it came to blatant cheating: not usually courses in their majors or courses they care about, but the courses they think are irrelevant and just another hoop. I am, of course, talking about General Education.

The role of Gen Ed in undergraduate studies has always been fraught. As a student in the 1980s, I enjoyed and learned a lot from some of the gen ed courses I took. But I also thought a lot of them weren’t worth it, and given the way these courses figured into my undergrad experience, the university seemed to have felt the same way. For starters, a lot of the Gen Ed classes were about subjects I didn’t like and/or I was not good at. Being required to take a science class with a lab felt a lot like being required to eat a big bowl of (insert the name of the vegetable you will not eat here) because it’s “good for you.” These were the classes you had to take before you were allowed to have the good stuff.

Plus, the university didn’t seem to think the Gen Ed classes were that important, certainly not as important as the “real” classes in the major. I got credit for about a semester’s worth of gen ed classes in part because of some high school classes I took, but mostly because I did well on the CLEP Tests. The gen ed classes I took were mostly large lecture hall courses with a sage on the stage talking away, a couple of tests and a final. If they were smaller discussion sections, they were almost always taught by graduate students or part-timers– that is, the lowest paid and least empowered instructors on campus.

As an educator, I believe in the theory behind Gen Ed. It provides students with introductions to fundamental skills and subjects they will use in many other courses and throughout life (writing and math classes immediately come to mind), and it helps students be “well-rounded” in terms of diverse experiences, critical thinking, “broader knowledge” of the world beyond their specialization, and so forth. Back when I was a student and it was a lot more common to start college as an “undeclared” major, Gen Ed was also a way of trying out different possibilities.

But that, as they say, was then, and this is now. Attending college has never been cheap, but the cost of attendance at supposedly affordable and public universities (like EMU) has gone through the roof. Taking classes that I thought were a waste of time didn’t cost that much when I was in college; it does now. “Students these days view college as consumers, in ways that never would have occurred to me when I was their age,” Hsu writes. “They’ve grown up at a time when society values high-speed takes, not the slow deliberation of critical thinking.” I think students have always viewed college at least partially as consumers, but higher costs and the increased value society puts on speed have made this worse. Stir in the rest of the shit of the world (social media, global warming, pandemics, depression, working too many hours to make ends meet, etc.), and the idea of a busy and stressed-out student using AI to cut some corners in classes they think are a waste of time makes a lot of sense.

As I’ve observed here and elsewhere many times before, this is the pattern of AI cheating I’ve seen in my own teaching. I see social media posts from colleagues and I read hair-on-fire MSM pieces and Substack posts about how ALL the students are ALL cheating ALL the time, but that has not been my experience. I don’t get a lot of students cheating with AI– well, unless I’ve had a bunch of Alexes that I never caught– because teaching writing as a process deters plagiarism. Plus I teach small classes (which also deters cheating), and I don’t have any “one-shot deal” paper assignments (as in “write a 5-page paper about ‘x’ and turn it in”) or any multiple choice/short answer tests or whatever, all things that AI can do well.

That said, my students do sometimes cheat, and at least 90% of the AI cheating incidents where I either required the student to redo the assignment (generally the first offense) or failed them (a second offense) have been in first year composition. The more advanced students in the courses I teach in the major or the MA program are all adamant about never cheating on their writing and certainly not with AI, and I believe them.

Like I said before, I skipped a lot of Gen Ed partly because I did well on those CLEP tests but also because I was an English major. That meant I never had to take a math class in college, and I only had to take that one science class I mentioned– astronomy, incidentally, which was the science class most students thought was the easiest option. But if I were a student now and I had to pass a Gen Ed math class along the lines of pre-calculus, I would be screwed.

Maybe I’d do the right thing and seek help from a tutor or someone, or if I got over my math phobias and if I really tried, maybe I could eke out a passing grade on my own. But if AI could help me in ways that might or might not be allowed by the instructor, ways that might cross the line from “help” to doing the work for me, and I thought I could get away with it, I would use AI. And if I had to take some kind of class that I thought was a waste of my time, and it turned out that all of the assignments for the class– quizzes, tests, essays, etc.– were ones that anyone could complete with AI, then I would use AI, at least to help cut some corners. Why wouldn’t I?

“Because that wouldn’t be ethical!” you might be thinking. Sure, but if my options are to take the risk to cheat or to fail out of college, I’m going to take the risk. And really, isn’t it unethical (or at least “problematic”) for universities to make students go up to their eyeballs in debt and to spend all that time taking courses that are nothing but a series of busywork and hoop-jumping assignments that some clanker could do just as well?

Now, I don’t really think AI “destroys” Gen Ed (see what I did there?), but it clearly changes how we should teach these classes and also what these classes are for. To me, that mostly means teaching in ways that minimize the usefulness or temptation of cheating with AI: small classes focused more on learning processes than on making products, and writing assignments that value reflection and research over simple regurgitation of whatever the professor said in lecture would be important on the test.

We also need to make these classes “real” and as meaningful and important as the classes students get to take in their majors. I can’t begin to imagine what the solution is to all this, but I suppose the first place to start with rethinking Gen Ed is to take a hard look at the courses and their formats. A class that can be successfully completed by someone who was never actually a student in the class– that is, there’s no participation component, there’s no specific context for the subject, there’s no assessment of material that is only delivered/discussed in the class– is a class that could probably be completed with AI. And it is also probably a pretty shitty Gen Ed class.
