Sunday, January 3, 2021

Show-Your-Work to Discourage Students Cheating Online

Like many, it appears my students will again be studying away from the face-to-face classroom in 2021, at least for the first part of the year, due to COVID-19. So I have been considering how to improve online delivery, and in particular how to improve the experience of assessment for students while deterring misconduct. One way may be to require students to "Show Your Work" in assessments.

Some Literature

As Nguyen, Keuseman and Humston (2020) note, online assessment suddenly became a major concern due to COVID-19. Some techniques they explored for chemistry students were short-answer and multiple-choice tests designed to probe higher-order thinking. They also looked at more frequent, lower-stakes tests, which are easier to administer online than face to face. The frequency, the authors suggest, may help with student learning, and the lower stakes reduce the temptation to cheat. Interestingly, the frequency is also claimed to reduce procrastination, as students are forced to study to pass the frequent small tests, rather than wait until a large one.

Nguyen, Keuseman and Humston (2020) also briefly looked at the use of an academic integrity pledge. However, they concluded this was only effective at an institution with a culture of academic integrity. This seems curiously circular: an institution that did not have a culture of academic integrity presumably would not have an integrity pledge.

To identify misconduct in online STEM courses, Sangalli, Martinez-Muñoz, and Cañabate (2020) analyzed learning management system logs of students undertaking exercises. They found that co-occurrence of responses was an indicator of collusion, with pairs of students answering the same questions at the same time. Of course, this might give a false positive for students who are legitimately studying together.
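The co-occurrence idea can be sketched in a few lines. This is a toy illustration only, not the authors' actual analysis: the log format, the student and exercise names, and the 120-second window are all my assumptions.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical log records: (student_id, exercise_id, unix_timestamp).
LOG = [
    ("alice", "ex1", 1000), ("bob", "ex1", 1030),
    ("alice", "ex2", 2000), ("bob", "ex2", 2050),
    ("carol", "ex1", 9000),
]

def co_occurrences(log, window=120):
    """Count, per pair of students, how often they answered the same
    exercise within `window` seconds of each other."""
    by_exercise = defaultdict(list)
    for student, exercise, ts in log:
        by_exercise[exercise].append((student, ts))
    counts = defaultdict(int)
    for submissions in by_exercise.values():
        for (s1, t1), (s2, t2) in combinations(submissions, 2):
            if s1 != s2 and abs(t1 - t2) <= window:
                counts[tuple(sorted((s1, s2)))] += 1
    return dict(counts)

print(co_occurrences(LOG))  # {('alice', 'bob'): 2}
```

In this sample, alice and bob answer both exercises within a minute of each other, so their pair count is high; carol, working alone hours later, is not flagged. A real analysis would also need a threshold for how many co-occurrences are suspicious, given that study groups produce the same signal.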

Catalena (2020) was able to use the pattern of submission of students' coding assignments to identify plagiarism in a computing course. As students can submit repeatedly to a system that validates their code, it is possible to distinguish those who make incremental progress from those who submit completed work. However, while submission of completed work without intermediate steps may indicate the student obtained a correct answer from someone else, it may be that they refined their answer on some other system, or that they are exceptionally talented.
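A crude version of such a submission-pattern heuristic might look like the following. The thresholds (`min_attempts`, `min_span`) and the labels are my own illustrative assumptions, not values from Catalena (2020), and as the paragraph above notes, a "one-shot" result is grounds for a closer look, not proof of misconduct.

```python
def submission_pattern(timestamps, min_attempts=3, min_span=3600):
    """Classify a student's submission timestamps (seconds) as
    incremental progress, a short burst, or a one-shot submission."""
    if len(timestamps) < min_attempts:
        return "one-shot: few attempts, worth a closer look"
    span = max(timestamps) - min(timestamps)
    if span < min_span:
        return "burst: many attempts in a short window"
    return "incremental: attempts spread over time"

print(submission_pattern([1000]))                    # one-shot
print(submission_pattern([0, 600, 1200, 1800]))      # burst
print(submission_pattern([0, 7200, 86400, 172800]))  # incremental
```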

Pribela and Pracner (2017, p. 99) propose a system within which students create computer code, limiting their ability to copy from outside the system. This makes it possible to check the consistency of the student's work. It also prevents students from forgetting to include some element in their final assignment submission, as everything is in the system and accessible to the examiner. This seems a heavy-handed way to prevent misconduct, and one which would stop students from using legitimate outside tools. The authors produced their own temporal file system, based on Git; however, most of the benefit, I suggest, could be obtained by just using Git and allowing students to import into it.

Dalziel (2008) suggests students prepare an ePortfolio, including their plans, notes, and comments, to deter plagiarism. If entries are timestamped, it is possible to see when they were added. The author points out that the PebblePad ePortfolio tool has a link to the TurnItIn plagiarism detection tool. However, they don't report any results from using this approach.


Students can be reluctant to start work on an assignment. With a deadline looming, a student can be tempted to reuse old work of their own, or someone else's: a form of poor academic practice or misconduct.

Students are urged to undertake their work methodically. However, they are usually asked to submit only the final product, not the steps taken to get there. This signals to the student that preparatory work is not of value. Students are therefore tempted to try to produce their final product directly, skipping the steps of its production. Students may also become frustrated, not understanding why they can't produce a polished result instantly, not realizing they are not the only ones who have to work through draft after draft.

An examiner competent in the discipline assisted by automated tools can look for signs of misconduct. However, this can be a time-consuming process which is stressful for staff as well as the accused student. Rather than look for poor practice, I suggest having students provide evidence of good practice.

A similar problem occurs with online examinations, where remote proctoring tools, such as ProctorU and Proctorio, have been employed. Those tools work reasonably well, but my colleagues at ANU Computer Science developed an alternative approach of self-invigilation for online examinations. Each student is encouraged to make recordings of themselves undertaking the exam, similar to those created by proctoring tools. The difference is that the student makes the recording, rather than having software imposed on them. If the examiners raise concerns about who sat the exam, the student can present the video as evidence.

I suggest this self-invigilation process could be extended to assignments through a Show-Your-Work process, as Dalziel (2008) suggests. Rather than just submit their final completed assignment, the student would make available to the examiner all drafts, notes, and other work that was used in preparing the assignment. If there was doubt as to the author of the final work, the examiner could look to see if there was a consistent body of work by the student supporting it.

The student would be required to use a system that timestamped their work and tracked any changes, not only to drafts of the final work but to all notes used. It would then be possible to look for a consistent pattern of work over days or weeks.
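Assuming the logging system records Unix-epoch timestamps, checking for a consistent pattern could start with something as simple as counting the distinct days on which work was saved. This is a minimal sketch of the idea, not a description of any particular tool:

```python
from datetime import datetime, timezone

def distinct_work_days(timestamps):
    """Number of distinct calendar days (UTC) on which the student
    saved timestamped work. A consistent pattern spans several days;
    a last-minute bulk upload collapses to one or two."""
    return len({datetime.fromtimestamp(ts, tz=timezone.utc).date()
                for ts in timestamps})

DAY = 86400
steady = [n * DAY for n in range(10)]            # one entry a day, 10 days
crammed = [9 * DAY + m * 60 for m in range(10)]  # 10 entries on the final day

print(distinct_work_days(steady))   # 10
print(distinct_work_days(crammed))  # 1
```

The two synthetic histories contain the same number of entries, but only the first shows work spread over the period, which is the signal an examiner would look for.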

This Show-Your-Work approach is already used in some group project-based courses in computer science at ANU. These require student teams to use project management software, such as GitLab. The examiner can see when work was uploaded to the system and who uploaded it, to see if this supports the team's claims.

Just as students can be videoed while undertaking an exam, a video record of all the time each student spent working on an assignment (or all the time they were studying) is technically feasible. However, this would be cumbersome, intrusive, and unnecessary. Logging all the work done on an assignment should be sufficient, as this is something the student should be doing anyway. Students in any field of study should work methodically, those in STEM, computer science, and engineering in particular.

While it would be possible for a student to fake the assignment preparation process, it would be cumbersome. The student would have to take the working notes of someone else (or commission these to be written) and submit them to the online system over several days or weeks. If the student tried to enter all the working notes just before the deadline, this would be evident from the timestamps.

Alternatively, the student could not only get the work from someone else, but have them submit it as well. This would require handing over their ID and password to upload the material. Two-factor authentication might help deter this, as the student would be required to enter a code sent to their mobile phone. Also, the possibility of a jail term for conspiracy and computer crime may deter some students.

Asking students to show the notes and drafts of their assignment is much the same as telling students on an assignment question to "show their work". Where a student doesn't show how they derived the final result, the examiner can reduce the grade. This doesn't require accusing the student of plagiarism.

If students are asked to show their work, they will need guidance as to what is a reasonable quantity of work over what period. During my MEd studies, I created a journal using the institution's Mahara e-portfolio system, in which I recorded my impressions of the program. I also created a journal for each course, and one for my capstone e-portfolio (in place of a thesis). In these, I made notes on references found and answers to study questions, and appended drafts of assignments. This was mostly to aid me in my work, and to have material to draw on for my reflective e-portfolio. However, I also kept it as evidence, in case I was ever charged with academic misconduct. Fortunately, that never occurred, and my journals remain readable only by me. I made an average of 100 postings per course, a total of 100,000 words, in addition to the assignments and capstone e-portfolio.

Based on my experience as a student, a reasonable guide would be 50 words of supporting material per percent of assessment. So an assignment worth 20% of the course assessment, undertaken over two weeks, would require 1,000 words of supporting notes in 10 posts. This would be in addition to ten drafts of the assignment, spread over at least five days of the two weeks.
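The guideline is simple enough to express as a small calculation. The split into posts assumes about 100 words per post, as implied by the worked example above:

```python
def supporting_material(weight_percent, words_per_percent=50,
                        words_per_post=100):
    """Rough guide: 50 words of supporting notes per percent of
    assessment weight, divided into posts of about 100 words."""
    words = weight_percent * words_per_percent
    posts = words // words_per_post
    return words, posts

words, posts = supporting_material(20)
print(words, posts)  # 1000 10, matching the 20% example
```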


Catalena, K. A. (2020). Mining Student Submission Information to Refine Plagiarism Detection (Doctoral dissertation). 

Nguyen, J. G., Keuseman, K. J., & Humston, J. J. (2020). Minimize Online Cheating for Online Assessments During COVID-19 Pandemic. Journal of Chemical Education, 97(9), 3429-3435. URL

Dalziel, C. (2008). Using ePortfolios to combat plagiarism. Hello! Where are you in the landscape of educational technology? Proceedings ascilite Melbourne 2008. URL

Pribela, I., & Pracner, D. (2017, January). A Temporal File System for Student's Assignments in The System Svetovid. In SQAMIA. URL

Sangalli, V. A., Martinez-Muñoz, G., & Cañabate, E. P. (2020, April). Identifying Cheating Users in Online Courses. In 2020 IEEE Global Engineering Education Conference (EDUCON) (pp. 1168-1175). IEEE. URL
