Plagiarism detection software often ineffective
- Studies of plagiarism detection software conducted over the last decade in the U.S. and Berlin have found little improvement in the software's ability to detect plagiarism, with false negatives remaining prevalent.
- Inside Higher Ed reports that Susan E. Schorn, a writing coordinator at the University of Texas at Austin, tested Turnitin in 2007 and 2015 and found similarly dismal results both times. She argues that continuing to use the software despite widespread knowledge of its failures amounts to lying to students in an effort to teach them about academic dishonesty.
- The Council of Writing Program Administrators has also criticized use of the software for shifting the role of writing instructors from coaching to enforcement.
While Schorn originally shared her 2007 study only with other writing instructors, Inside Higher Ed received her latest results along with a copy of the earlier study. The 2007 study ran six test essays through Turnitin, SafeAssign, and Google, finding that Google's free search far outperformed the proprietary options, which identified barely half of the plagiarized sources. In 2015, Schorn re-tested only Turnitin and found no improvement in the software.
Administrators must decide whether to purchase software like Turnitin or SafeAssign to help writing instructors identify plagiarism, and studies like Schorn’s should be considered in that process.
- Inside Higher Ed: "What Is Detected?"
By Tara García Mathewson