A few days ago, Marc Bousquet posted on Facebook a link to “Technology Is Taking Over English Departments: The false promise of the digital humanities” by Adam Kirsch, published in the New Republic. Kirsch obviously doesn’t think highly of digital humanities and technology, preferring the feel and smell of paper and the old-fashioned magic of old-fashioned reading, and Bousquet obviously didn’t think much of Kirsch’s critique. Bousquet posted about the Kirsch article twice, for some reason; to quote (can I quote Facebook like this?):
“Technology Is Taking Over English http://t.co/d21kSd5opr Ahistorical & stupid cuz comes from a lit-dh discourse bypassing rhet-comp. Duh.”
“DH added strawberries to breakfast cereal! The era of breakfast cereal is over! Moral panic in lit makes it to TNR: http://t.co/d21kSd5opr“
I agree with Bousquet: Kirsch’s piece is wrong, but it’s more than that. I think it is in places almost perfectly, exquisitely wrong. To me, it’s like a rhetorical question that falls flat on its face because of Kirsch’s many assumptions about the problems of the digital and the purity of the humanities. And this made me realize something: it’s time for me to admit that I’m actually a digital humanities scholar/teacher and have been all along. It’s time for me to put aside petty arguments and differences (I’ll get to that below) and jump on that bandwagon.
Here’s a long quote from that New Republic essay to demonstrate what I see as perfectly incorrect:
Was it necessary for a humanist in the past five hundred years to know how to set type and publish a book? Moreover, is it practical for a humanities curriculum that can already stretch for ten years or more, from freshman year to Ph.D., to be expanded to include programming skills? (Not to mention the possibility that the kind of people who are drawn to English and art history may not be interested in, or good at, computer programming.) Like many questions in digital humanities, this one remains open. But the basic emphasis on teamwork and building, as opposed to solitary intellection, is common to all stripes of digital humanists. Digital_Humanities leaves no doubt that the future of the field belongs to democratic groups, not elitist individuals:
The myth of the humanities as the terrain of the solitary genius, laboring alone on a creative work, which, when completed, would be remarkable for its singularity—a philosophical text, a definitive historical study, a paradigm-shifting work of literary criticism—is, of course, a myth. Genius does exist, but knowledge has always been produced and accessed in ways that are fundamentally distributed, although today this is more true than ever.
Once again, the “of course” signals that we are in the realm of ideology. As an empirical matter, the solitary scholar laboring on a singular paradigm-shifting work is quite real. Mimesis is not a myth, and neither is Major Trends in Jewish Mysticism, or Philosophy in a New Key, or The Civilization of the Renaissance in Italy—you can go to the library and check them out (or, if that takes too long, download them). There is no contradiction between this fact and the idea that knowledge is “fundamentally distributed.” Scholarship is always a conversation, and every scholar needs books to write books. Humanistic scholarship has always been additive and collaborative even if it has not been in the strict sense collective. It is not immediately clear why things should change just because the book is read on a screen rather than a page.
For me, literally every assumption Kirsch is making here is just wrong. Yes, it has always been necessary for humanists to understand the means of production of texts, and they fail to do so at their peril. Yes, we need to teach students how to use the tools they write with, be those tools pens or word processors or computer code. Yes, teamwork and collaboration are critical in the writing process, not only in terms of co-authorship but in terms of feedback from readers: literacy is a social activity. Yes, things should (and do!) change when we read books on screens rather than pages. Jeez, that’s Bolter’s Writing Space, and the first edition of that book was published in 1991!
Anyway, this got me to thinking that maybe it’s time I jump on this digital humanities bandwagon once and for all. (Not that anyone else cares about this; I realize this is a self-indulgent blog post that perhaps has an audience of only me, but that’s never stopped me before.)
I’ve had my issues with the DH movement in the past, especially as it’s been discussed by folks in the MLA (see here and especially here). I have often thought that a lot of the scholars in digital humanities are really literary period folks trying to make themselves somehow “marketable,” and I’ve seen a lot of DH projects that don’t seem to be a whole lot more complicated than putting stuff up on the web. And I guess I resent and/or am annoyed by the rise of digital humanities in the same way I have to assume the folks who first thought up MOOCs (I’m thinking of the Stephen Downes and George Siemens of the world), way before Coursera and Udacity and edX came along, are annoyed by the rise of MOOCs now. All the stuff that DH-ers talk about as new has been going on in the “computers and writing”/”computers and composition” world for decades, and for these folks to come along now and coin new terms for old practices, well, it feels like a whole bunch of others’ work has been ignored and/or ripped off in this move.
But like I said, if you can’t beat ’em, join ’em. The “computers and writing” world, especially vis-à-vis its conference and its lack of any sort of unifying “organization,” seems to me to be fragmenting and/or drifting into nothingness at the same time that DH is strengthening to the point of eliciting backlash pieces in a middlebrow publication like the New Republic. Plenty of comp/rhet folks have already made the transition, at least in part. Cheryl Ball has been doing DH stuff at MLA lately and had an NEH start-up grant on multimedia publication editing; Alex Reid has had a foot in this for a few years now; Collin Brooke taught what was probably a fantastic course this past winter/spring, “Rhetoric, Composition, and Digital Humanities”; and Bill Hart-Davidson and Jim Ridolfo are editing a book of essays that will come out in the fall (I think) called Rhetoric and the Digital Humanities. There’s an obvious trend here.
But I think the main reason it makes sense for me (and obviously others) to think of myself as a DH practitioner is strategic. It’s all about the terminology. We can argue about whether the terms do or should matter, of course. This last winter, I taught a “topics in” graduate course on “Multimedia Writing” that will turn into a regularly offered graduate course called “Writing Digital Media,” and we read Claire Lauer’s essay “Contending with Terms: ‘Multimodal’ and ‘Multimedia’ in the Academic and Public Spheres.” I thought it was an interesting and useful essay, especially as my colleagues and I wrestled over the name we wanted to give this course as a permanent offering, but my students mostly thought it was splitting hairs: who cares what you call it? But the fact of the matter is that the terminology matters a great deal, and often in unfortunately consequential ways.
A simple example: if you go to the National Endowment for the Humanities website and do a search for “computers and writing,” you get exactly zero hits. If you do a search for “digital humanities,” you get hundreds and hundreds of hits. So, which term matters?