Turnitin to identify authors through machine learning
The service will build a writing profile for each student
Turnitin, the anti-plagiarism essay submission system used by USyd, plans to launch a new product later this year to coincide with the start of the American school year.
The new software, ‘Authorship Investigation’, will use machine learning to monitor and learn the writing styles of individual students and to flag content that diverges markedly from their previous work.
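Turnitin has not published how ‘Authorship Investigation’ models a student’s style, but the general idea of stylometric comparison can be sketched in a few lines. The example below is purely illustrative: it reduces each essay to a handful of crude features (sentence length, word length, function-word frequencies), averages them across a student’s past submissions, and flags a new essay whose feature vector drifts beyond an arbitrary threshold. The feature set, the threshold, and every function name here are assumptions, not Turnitin’s method.

import math
import re

# Tiny set of function words; real stylometry systems use hundreds of features.
FUNCTION_WORDS = ["the", "of", "and", "to", "in", "that", "it", "is", "was", "for"]

def style_features(text):
    """Crude stylometric feature vector for one essay (illustrative only)."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not words or not sentences:
        return [0.0] * (2 + len(FUNCTION_WORDS))
    avg_sentence_len = len(words) / len(sentences)           # words per sentence
    avg_word_len = sum(len(w) for w in words) / len(words)   # characters per word
    func_freqs = [words.count(fw) / len(words) for fw in FUNCTION_WORDS]
    return [avg_sentence_len, avg_word_len] + func_freqs

def profile(past_essays):
    """Average the feature vectors of a student's previous submissions."""
    vectors = [style_features(e) for e in past_essays]
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def cosine_distance(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - (dot / norm if norm else 0.0)

def flag_divergence(past_essays, new_essay, threshold=0.15):
    """Flag a submission whose style diverges from the student's profile.
    The 0.15 threshold is arbitrary and chosen only for illustration."""
    return cosine_distance(profile(past_essays), style_features(new_essay)) > threshold

A commercial system would presumably use far richer features and a trained model rather than a fixed cut-off, but the shape of the comparison, a stored profile measured against each new submission, is the same.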
The product has been developed primarily in Australia, with Deakin University, Griffith University, UNSW, the University of Queensland, and the University of Wollongong advising Turnitin on its development, along with the University of California San Diego and the University of Northampton.
This Australian influence was spurred by a 2014 Fairfax Media investigation that revealed up to 1,000 students from 16 universities had hired the Sydney-based MyMaster company to ghost-write their assignments and sit online tests. A similar investigation by the UK’s Daily Telegraph revealed that up to 20,000 students were purchasing essays from online writing services, known as ‘essay mills’, with some paying up to £6,500 for bespoke dissertations and PhD theses.
Last year Turnitin replaced its ‘classic’ similarity detection program with the currently used ‘Feedback Studio’, a largely cosmetic makeover that rebranded the feedback and grading interface.
‘Feedback Studio’ is offered to tertiary institutions, while K-12 customers can purchase either ‘Feedback Studio’ or ‘Revision Assistant’. ‘Revision Assistant’ allows students to submit drafts of their work for instant assessment: teachers assign writing prompts from a bank built into the product, and for every prompt in that library Turnitin collected about 500 samples of student writing, each scored against a 16-point rubric covering the full range of quality, from weak to strong. The system uses machine learning to adapt to each new essay and compare it against those scored samples.
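Turnitin does not disclose how Revision Assistant maps a draft onto that 16-point rubric. One simple way to score an essay against a bank of human-graded samples, offered here only as an illustrative stand-in and not as Turnitin’s actual algorithm, is a nearest-neighbour comparison: represent the draft and each sample as word counts, find the most similar samples, and average their rubric scores.

from collections import Counter
import math

def bag_of_words(text):
    """Very crude text representation: lowercase word counts."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def predict_rubric_score(draft, scored_samples, k=5):
    """Estimate a 1-16 rubric score by averaging the scores of the k most
    similar sample essays. 'scored_samples' is a list of (essay_text, score)
    pairs, standing in for the roughly 500 graded samples per prompt."""
    draft_vec = bag_of_words(draft)
    ranked = sorted(scored_samples,
                    key=lambda pair: cosine_similarity(draft_vec, bag_of_words(pair[0])),
                    reverse=True)
    top = ranked[:k]
    return sum(score for _, score in top) / len(top)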
It has to be questioned whether kindergarten students really need instant, AI-based feedback on their work. If students can get instant feedback on the areas where they are potentially ‘lacking’, teaching and writing styles will surely evolve with the algorithm, changing the way students learn. Students are no longer writing for a teacher to mark their work, but drafting submissions they think the AI will find acceptable.
This machine learning software, perfected on children under eighteen years old, will now be applied to create a personal profile of your writing style. In ‘Revision Assistant’, teachers can see students’ progression over time, and this feature will surely make its way into Authorship Investigation. In this dystopian future, instructors could compare your work to your ‘best’ submitted prose. Are you writing as well as you ever have? Do you deserve to be marked down if you aren’t submitting your best work?
Similarly, do students write with the same profile in class as they do when pulling an all-nighter? What about under the pressure of exam conditions? A 2016 ANU report noted that the widespread use of Turnitin is breeding a culture of “mistrust and anxiety”.
This product skirts the same boundaries as the University’s 2016 trial of the anti-plagiarism software Cadmus, which tracked students as they completed their assessments, verifying their identities through multi-factor authentication and keystroke analytics. The trial was abandoned, with Deputy Vice-Chancellor (Education) Pip Pattison noting: “staff and students hated it.”
Turnitin isn’t just selling teachers and administrators a product. The marketing on its website frames Turnitin less as software and more as a pedagogical lifestyle brand. In fact, the word “plagiarism” appears only twice on its home page, despite the tool being first and foremost a plagiarism detection service.
This service, which reportedly costs institutions around $2 per student per year, allows Turnitin to use students as unpaid labourers, writing millions of essays that it can use to refine its AI software and provide a ‘better’ product in the future.
But will students benefit at all from this? Even official Turnitin documents note that false positives can occur simply because a student has re-used research across multiple projects (as one would when writing a thesis), or has previously checked their work through a Turnitin-owned, student-focused ‘similarity checker’. In an ideal world, this machine learning may not pose a threat to ‘honest’ students, but cheater or not, you are just an unpaid guinea pig in Turnitin’s maze.
Directly Plagiarised From
The Telegraph UK
The Australian
Sydney University
The Sydney Morning Herald
CampusTechnology.com
HybridPedagogy.org
Honi Soit