'Day 318: Blood, Sweat, and Tears' photo (c) 2007, Quinn Dombrowski - license: http://creativecommons.org/licenses/by-sa/2.0/

One of the latest hot new ed tech topics, encouraged along by another questionable use of private foundation dollars, is developing software so computers can grade essays. That's certainly going to energize my students to write: instead of working, as I do now, to identify authentic audiences for their writing, they can plan on not having even a single human's eyes review it. It also entirely misses the point of strategically supporting writers who might be facing lots of challenges in their writing. For example, having an English Language Learner, or a student who has had little writing success in their life, get tons of mistakes pointed out every time they write something is not going to do much for motivation.

What do you think?

Here are the best pieces out there that I’ve been able to find on the topic:

Automated essay scoring on state writing tests: as efficiently “meh” as human graders by Sherman Dorn is the most thoughtful post I’ve seen.

Man vs. Computer: Who Wins the Essay-Scoring Challenge? is from Ed Week (be sure to read the comments).

Robo-readers: the new teachers’ helper in the U.S. is from Reuters.

Here’s a Storify documenting a discussion on this topic between Will Richardson and Justin Reich.

The tests that can be computer scored appeared in Joanne Jacobs blog (again, please read the comments).

Grading essays: Humans vs. machine appeared in USA Today.

A Win for the Robo-Readers is from Inside Higher Ed. Thanks to Joe Fahs for the tip.

Justin Reich is writing a three-part series on this topic over at Ed Week.

Facing a Robo-Grader? Just Keep Obfuscating Mellifluously is an excellent article in The New York Times. The Director of Writing for M.I.T. is quoted in it.

Computer Scoring Open Ended History Questions is by Tom Hoffman.

Robot Eyes As Good As Humans When Grading Essays is a strange title for an NPR interview with NY Times writer Michael Winerip.

Machines Shouldn’t Grade Student Writing—Yet is from Slate.

More Important Things to Do With Student Writing Than Just Grade It is by Renee Moore.

How ETS’ computer-based writing assessment misses the mark is by Maja Wilson.

Lies, Damn Lies, and Statistics, or What’s Really Up With Automated Essay Scoring is by Todd Farley.

Computers Grade Essays Fast … But Not Always Well is from NPR.

Humans Fight Over Robo-Readers is from Inside Higher Ed.

Essay-Grading Software Offers Professors a Break is from The New York Times.

How Would You Feel About a Computer Grading Your Essays? is from The NY Times.

22 Thoughts on Automated Grading of Student Writing is from Inside Higher Ed.

Can computers really grade essay tests? is from The Washington Post.

Grading writing: The art and science — and why computers can’t do it is from The Washington Post.

Burnt Offerings is from the blog Honesty, honestly…

Essay-Grading Software Seen as Time-Saving Tool is from Education Week.

New Common Core exams will test whether a robo-grader is as accurate as a human is from The Hechinger Report.

Flunk the robo-graders is from The Boston Globe.

For a different take, see Robo-reader redux: Can a curious computer improve student writing? from The Hechinger Report.


More States Opting To ‘Robo-Grade’ Student Essays By Computer is from NPR.

Flawed Algorithms Are Grading Millions of Students’ Essays is from Vice.

TEACHER VOICE: Does AI, as a tool, deliver student feedback more effectively than the ballpoint pen? is from The Hechinger Report.

PROOF POINTS: A smarter robo-grader is from The Hechinger Report. Its subtitle says everything you need to know: “But first you have to train them, and they aren’t useful for classroom teachers.”

If you found this post useful, you might want to consider subscribing to this blog for free.

You might also want to explore the nearly 900 other “The Best…” lists I’ve compiled.