"Stories are what we live in" wrote Kerwin Klein. So here is one story. A couple of weeks ago, Scott Petri, sent out a query on twitter about robograders. I asked, in response, why anyone would participate in their own de-skilling. After a bit of back and forth, we returned to our lives and families. A few days later Scott asked if I would co-host a twitter chat on robograding and perhaps I would do a couple of blog posts on the subject. I said sure. Scott and I set up a Skype chat to touch base and he sent me a couple of ed-school articles to read. I didn't read them. I was confident in the rightness of my views. Robograding, I was convinced, was the devil's tool, designed to aid the depersonalization of the classroom and further diminish the wide range of skills that go into teaching. Robograding, to me, was the beginning of the mechanization of teaching in a way that the sewing machine introduced the mechanization of clothing production. It was a story that labor historians have been telling for a long time. It brought up images of bubble tests and scantron and something of which I wanted no part.
But that's not the story I'm telling today.
So here's a different story. This is one about a TA in a Big Ten grad program. He had really bad handwriting. He had a hard time grading blue book exams. To increase his own efficiency and decrease his students' frustration with his handwriting, he started numbering the comments in the blue book, typing them up on his computer, and then printing them. Each student got about one third of a page of feedback, stapled into their blue book. And thus, in 1994, I began my forays into computer assisted grading. Since then, I've been employing a number of computer assists in my grading. Beyond Microsoft Word, I've used a variety of technologies; I currently use Google Docs, Haiku LMS, and Turnitin.com to give feedback. I can't imagine not grading on a screen. In short, I do a lot of my grading with computers. I prefer to call it computer assisted grading. Of course, I'm lying to myself. I'm using machines to help me teach more efficiently. I'm robograding.
So if we are going to have this conversation about robograding, it can't be judgmental. We have to ask: Why do we robograde? What do we want the machines to do for us as teachers? What do we want the machines to do for our students? And, perhaps most importantly: What don't we want the machines to do? What lines don't we want them to cross? What prices are we willing to pay? What prices are we not willing to pay?
Many, many years ago the great historian and provocateur Virginia Scharff made me read Donna Haraway's Simians, Cyborgs, and Women. At the time, I missed the point. I was deeply enamored of Judith Butler's ideas on the social construction of gender, and Haraway's ruminations on gender seemed needlessly complex. If gender were a social construction, then the biology didn't matter. But Haraway recognized that technology was changing our very definitions of self, enabling new discourses and constructions. She anticipated Caitlyn Jenner and the trans rights movement.
Just this week the great historian and provocateur Audrey Watters challenged us by asking whether "It Is Time To Give Up on Computers in Schools." Watters reminds us that technology is never value neutral and that teachers need to take control of it lest the ed-tech overlords continue to have their way with us and our students. Down that latter path lies the dystopian future I think of when I hear the word "robograding."
Which brings me back to Haraway. She challenged us to ask "How might an appreciation of the constructed, artefactual, historically contingent nature of simians, cyborgs, and women lead from an impossible but all too present reality to a possible but all too absent elsewhere?" She challenged us to unite as "Cyborgs for earthly survival!"
By having this conversation, I hope we can take control of computer assisted grading for our own purposes and for the futures of our students, our schools, and our country.
Happy Fourth of July.