
Digital essay grading not in BU’s immediate future

MOOC platform edX has developed new software that may help professors grade essays and student responses. PHOTO ILLUSTRATION BY XIAOMENG YANG/DAILY FREE PRESS STAFF

Massive open online course platform edX may change the nature of grading at colleges such as Boston University through a new digital essay-grading system that eliminates the delay of traditional essay grading, officials said.

Jack Ammerman, university librarian and member of BU’s Council on Educational Technology and Learning Innovation, said while it is possible BU officials will incorporate such grading techniques into BU’s grading system in the future, the software must first be proven to be a successful academic tool.

“The focus of our efforts at BU is to enhance the residential and educational experience,” Ammerman said. “Before we decide to implement such technology, we must be sure that its grading matches the quality of a professor’s.”

Ammerman said it is unlikely that such changes to the grading system will happen in the near future because BU does not want to introduce an inferior product into its educational system.

“The system is still in ‘test mode,’” Ammerman said. “So BU has not yet really explored the possibility of implementing this technology into our grading system.”

The edX software, which The New York Times reported on Friday, is an open-ended assessment tool designed to help teachers grade papers and short-response questions with artificial intelligence, said Vik Paruchuri, edX machine learning engineer. Unlike traditional grading, the software would provide students with immediate feedback.

Paruchuri said the software has received criticism from people concerned that it could remove the personal element from grading, but he said the system is more complex than that.

“Although this technology uses artificial intelligence, it is not the totality,” Paruchuri said. “This software combines peer- and self-assessment, along with machine assessment.”

Paruchuri said edX officials incorporated different types of assessment into the new technology since certain types are already used in classrooms and have pedagogical value.

He said professors would have control over how students’ work is assessed, as professors can design an individualized rubric by which the work is graded.

“The rubric has proven to be an excellent tool because professors are able to define exactly what they are looking for,” Paruchuri said. “They can also define how they would like the piece to be assessed, whether through self, peer, machine or all three.”

Paruchuri said that while the essay-grading technology is not yet fully developed, his edX team is working alongside professors to see what works and what does not. He said research so far shows the software has worked well in some science courses.

“There is no doubt that this technology is not right for all subject matter, such as humanities courses,” Paruchuri said. “However, we tested these technological components in a chemistry course … and students reacted very positively.”

Bryn Sfetsios, a third-year School of Law graduate student, said receiving instant feedback would be beneficial, but the technology may not be able to accurately grade student responses.

“I would only trust such a software’s feedback for basic grammar, but not for content,” Sfetsios said. “This technology seems like it will focus on tag words, and essays are more about how a student creates an answer rather than the specific words they use.”

She also said the software could pose difficulties if a student wished to dispute his or her score, as the professor may have never read the essay in question.

Molly Trillo, a College of General Studies sophomore, said if she knew a professor used this system, it would deter her from taking his or her class due to the lack of personal interaction the technology may cause.

“I’m dyslexic so there are papers I’ve handed in where I’ve accidentally put in the wrong words,” Trillo said. “My professors are aware of my dyslexia and take it into account, but a computer would not know that.”

Trillo said this software could be appropriate in some classes, but not others.

“This automated grading system could be good for purely factual-based classes, but it could be problematic for opinion-based classes because those require more creativity,” Trillo said.

Adarsh Parikh, a Sargent College of Health and Rehabilitation Sciences sophomore, said he would like the technology if it could be proven to grade responses in a fair and accurate way.

“Half the time I don’t even take feedback seriously, so I wouldn’t mind the system if it was efficient and graded based on a fair rubric,” he said. “In the end, I just want a good grade.”
