Some miscellaneous thoughts on the iPad while I watch the intro video

#1: Clearly, there was not a woman on the development team. Already all the “feminine hygiene” jokes have been made, and I am quite confident that a woman on the team would have flagged the “problem” with the iPad name before launch. But beyond that, note that this intro is a bunch of white guys.

#2: I still await a device where I can store, read, and make notes on PDFs. I think. As I have commented/posted about before, I don’t read that many trade-press books of the type you’d read with iBooks or Kindle, but a device where I could access the piles and piles of marked-up PDFs of journal articles I use to teach would be very VERY useful to me. I don’t think this does that yet. On the other hand, since this thing is tied to the open-source ePub platform, I suspect there will be some way to convert PDFs relatively easily, and relatively soon.

#3: I think this is more of a “netbook” than it is a giant iPod. I say that because you can add a keyboard, because the built-in on-screen keyboard seems pretty workable, and because I think you’d use this pretty much the same way you’d use a netbook: some surfing, some reading, some movies, some email, some Facebook, some games, etc., all in a very portable package. Every situation where I can imagine using a netbook would be a good one for the iPad, I think. Or maybe the iPod Touch is just a tiny netbook.

#4: I’m pretty sure I want one. And I am also willing to be one of the first kids on the block with one at this point, even though I am well aware that something much better will come out in about a year. I want to play around with it and do some more research first, but the $500 entry-level price point surprised me. Anyway, do me a favor and talk it up as a good idea with my wife.

Oh yeah? I planned it so I wouldn’t have so many readers/friends!

From a couple of different places, I came across this Mashable article, “Your Brain Can’t Handle Your Facebook Friends,” which suggests that according to Dunbar’s number, the number of people you can really be “friends” with is about 150. This reminds me of an article by Clive Thompson in the current issue of WIRED, “In Praise of Obscurity,” in which he talks about how when an audience becomes too large, it is no longer “social.” He uses the example of a popular Twitter-er (???) named Maureen Evans who started tweeting recipes, became hugely popular (13,000 followers), and said the conversation between users just stopped. I’ll post a link once WIRED puts one up, probably when the next issue comes out.

First off, I blogged about this very phenomenon back in 2007 here, in talking about both Facebook and my struggling (dying?) “Blogs as Writerly Spaces” project. (Perhaps I can count this post as something that will allow me to check off “worked on scholarship today” from my to-do list.) As I noted back then, since I think the readership of this blog is generally pretty small, I don’t need a lot of rules; on the other hand, when it was routinely getting 600-1000 hits a day (that’s fallen off to about half of that now), I did indeed need to set up rules. In that sense, the Dunbar number seems to be a threshold for organization as much as anything else. If you have a group of people who like to play ultimate frisbee or pick-up basketball or softball every Friday night at a particular park and that group is fewer than 150 or so people, then you probably don’t need much in the way of “rules.” But if that group gets above 150, then I suspect you need to start forming a “league” with organized teams, schedules, etc.

Second, this all raises once again the question of how to define “friend,” something that has been a little easier to sort out on Facebook as of late thanks to its new “list” feature. I think in the context of Facebook, people have basically over-valued and/or misinterpreted the word “friend.” In “real life,” I think of a friend as either someone I know quite well and engage in activities with on a regular basis (e.g., family friends, golfing friends, people I invite to my house for a party or something), someone I know pretty well but only catch up with once in a while (e.g., many/most people at work, friends who live some distance away), or someone from a more distant past whom I haven’t necessarily even spoken with in some time. This last category is a big one on Facebook: we have all “friended” people from high school or college whom we haven’t seen or spoken with in decades and whom we aren’t especially interested in reconnecting with in “real life” again now, but who are still a kind of friend.

I have “real life” friends on Facebook, but besides those “real” friends, most of my Facebook friends fall into the categories of colleagues in my field, people at EMU, and/or students. No offense to any of these folks, but y’all aren’t really my friends in the real-world sense, right?

Third, I guess the other thing that comes up, especially in the Thompson article, is my concept/understanding of who I am “speaking” with when I post online, be that a space on Facebook, Twitter, this blog, or some other blog. This may be kind of “old skool,” but I still work from the assumption that anything I post online has the potential to be read by anyone on the planet; therefore, I would never post any sort of personal thing that I would be concerned about some stranger reading. You’re not going to get any “weird rash on my hands not going away” posts from me (btw, I have no rashes). And if I post something like “ate tuna sandwich,” it is only because I don’t really care if anyone knows that I ate a tuna sandwich.

The tricky thing about this is trying to figure out those borders between the actually personal, the things you really would only tell to close friends, and everything else. This is nothing new, of course; what makes it a little different now is that the sheer volume of people on networks like Facebook means there is inevitably a learning curve for both writers and readers about the shifting definition of “Too Much Information.” I mean, I have FB “friends” who do seem to think that posting about that mysterious rash is fair game; conversely, I also have FB “friends” who would comment on my lunch selection with “Ew, TMI.” So it goes with emerging media, right?

BTW, today I’m going to have left-over pork loin for lunch.  If it isn’t too freezer-burned.

Three thoughts on poly-ticks

Thought (frustration, really) #1: Reagan, both Bushes, and Clinton never had close to 60 votes in the Senate, and they got stuff done. What is wrong with the current Democratic leadership– Obama, but also the folks in Congress– that they can’t get things done? Haven’t these people done this before?

Thought #2: I think the main reason the Democrats lost the Senate race in Massachusetts (and btw, I think they lost it rather than the Republicans winning it) boils down to “hubris.” Democratic leadership in DC and in Boston simply assumed that it wouldn’t be possible for a Republican in bluer-than-blue Mass. to win “the Kennedy seat,” and that they could run a potted plant for the job and win. Hubris, and the lesson should be to take every election seriously and not assume anything.

Thought #3: I am (or at least vote) Democrat for all sorts of different reasons, not the least of which is that I identify with the progressive ideals, the empathy for my fellow citizens of the country and the world, the thoughtfulness of the approach, etc., etc. The Democrats (at least the current version) are the “thinking person’s party.” In contrast, the Republicans– especially in this particular instance of debating health care and the Senate race in Mass.– tap into the “reptilian brain” that is in all of us, below the level of reason. The Republicans know that people respond unconsciously and powerfully to fear and self-interest. And I have to say I think the Democrats are going to have to make at least a nod to the reptile brain that is (unfortunately) a bit too forward in too many Americans if they are going to hold on in 2010 and/or win in 2012.

Lotsa links/reader round-up

I have been procrastinating from cleaning my office by a) teaching (well, that’s kinda my job, so that doesn’t count as procrastination), and b) looking through some piled up google reader links.  So in an effort to put off office cleaning a bit longer, here’s a bunch of links in no particular order:

Okay, cleaning will commence.  Soon….

As the happy academic, I contemplate the profession’s journey to hell in a handbasket. Or not.

I’ve been working all day trying to figure out what my classes for the winter term (which starts tomorrow) are going to look like.  I was going to write “working my ass off,” but let’s face it:  working in academia isn’t exactly manual labor, a point I’ll return to in a moment.  It involves a lot of sitting, a lot of thinking, a lot of reading online and on the page.  It’s fun.  Hitting the gym and eating right to reduce the size of previously mentioned ass– now that’s work.

Anyway, earlier today via Facebook and Twitter, I came across this CHE article by Thomas “not his real name” Benton, “Graduate School in the Humanities: Just Don’t Go.” It’s an article about why getting a PhD in “the humanities” in general is a bad idea, and it comes on the heels of a number of articles about how dreadful the job market is for academics at the MLA and, as this piece in Inside Higher Ed suggests, fields like history and economics as well.  I agree with at least two things in Benton’s article:

  • A lot of potential graduate students in his and my generation received bad advice.  “Having heard rumors about unemployed Ph.D.’s, some undergraduates would ask about job prospects in academe, only to be told, “There are always jobs for good people.” If the students happened to notice the increasing numbers of well-published, highly credentialed adjuncts teaching part time with no benefits, they would be told, “Don’t worry, massive retirements are coming soon, and then there will be plenty of positions available.” The encouragement they received from mostly well-meaning but ill-informed professors was bolstered by the message in our culture that education always leads to opportunity.”  I think that’s spot-on, and it makes me glad that my entry into graduate work in the late 1980s was in an MFA program– not that that was a great career move, but the stakes were a lot lower than a PhD, and it was useful in lots of other ways.
  • Getting a job as a professor– particularly a humanities/literature professor– is not as easy as getting the degree, and getting the degree isn’t that easy either.  “They don’t know that you probably will have to accept living almost anywhere, and that you must also go through a six-year probationary period at the end of which you may be fired for any number of reasons and find yourself exiled from the profession. They seem to think becoming a humanities professor is a reliable prospect — a more responsible and secure choice than, say, attempting to make it as a freelance writer, or an actor, or a professional athlete — and, as a result, they don’t make any fallback plans until it is too late.”  Also very true, and I like the comparison of being a professor to these other less than “sure thing” professions.  You want a “sure thing” at a job where you can make good money, live almost anywhere, work on your schedule (within reason), and help people?  Be a nurse.

But as I skimmed and reskimmed the article during my day, while I was putting together the previously mentioned syllabi for English 328 and English 516, I got to thinking a bit more.

Continue reading “As the happy academic, I contemplate the profession’s journey to hell in a handbasket. Or not.”