There have been several posts asking questions about test logs and cheating. The standard Canvas support answer seems to be that the analytics are not intended to be used to detect cheating. My question is: why not?
As an example, I had a student enter a 250-word answer, and the time log indicated that the answer was entered over about 30 seconds. That would be a typing speed of 500 words per minute, which is basically physically impossible. Expert typists reach roughly 120 wpm, so a 500 wpm input would certainly suggest that the answer was pasted in.
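For reference, the back-of-the-envelope arithmetic behind that figure can be sketched in a few lines of Python. This is only an illustration; the word count and elapsed time are the numbers from the example above, read off the quiz log by hand, and nothing here queries Canvas itself:

```python
# Minimal sketch of the typing-speed check described above.
# Inputs are taken manually from the quiz log entry.

def words_per_minute(word_count: int, elapsed_seconds: float) -> float:
    """Apparent typing speed implied by the log entry."""
    return word_count / (elapsed_seconds / 60)

wpm = words_per_minute(word_count=250, elapsed_seconds=30)
print(f"{wpm:.0f} wpm")  # 500 wpm -- far beyond the ~120 wpm of an expert typist
```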
Why does Canvas support discourage us from using the analytics present in the test logs for identifying some instances of cheating similar to my example?
One concern with the timing data is that, while we can probably say that someone spent just 30 seconds on that question screen, we can't really determine where they got their answer from or what they were doing before they answered. In some cases, students may have had their own prepared text available to copy from. I can imagine a scenario where an instructor gives students a heads-up on the 5 possible essay questions they may encounter in the exam, and the students prepare their essays ahead of time. Or the student passes the essay question on their way through the exam, composes the answer in their laptop's notepad before coming back to it, and then returns to the essay question screen to paste in the answer. We just can't prove where the answer came from, and that's the missing piece. Your scenario may be pretty clear-cut, but there are many ways to present an essay question between K-12 and higher ed that can muddy the waters on this single data point.
That being said, the time spent on a question is a valuable indicator of possible cheating, and it should be weighed alongside the student's previous work quality and engagement in the class. It's an indicator but maybe not an indictment.
I often suggest to students that if they're responding to an essay question or discussion post with a longer answer that they compose it in a word processor and then copy/paste into Canvas. This protects their work on the off chance that there's a technical issue with the submission and it doesn't go through. If they'd been composing in Canvas then they could potentially lose their work. This doesn't happen very often, but it does come up a time or two each semester. My guess is that when this happens it's either a Canvas time-out issue or a problem with the student's Internet connection.
-Brett
Laura, for my exams students do not have advance knowledge of the essay questions, and unless they have dishonestly acquired an advance copy of the exam, they should not have prepared answers in advance. If they produce an answer with more words than can be physically typed in the time spent on the question, then it had to have been pasted in from an outside source or as a pre-prepared answer. Both would be cheating if there was no advance knowledge of the questions. I understand there are circumstances where the wpm could be acceptable, but what about exams where that is not the case? That is my situation: students do not have advance knowledge of the exams, which is clearly different from the situations you describe.
But if the student composes an original answer in a Word document or on Google Docs, etc., and then copies and pastes it into Canvas, that's not cheating. I understand the frustration of having no way to know where or how the student composed their answer, but how quickly an answer is 'typed' is really an indicator of nothing. And yes,
Canvas can be glitchy and lose answers. Students who compose their answers on another platform are actually being smart about protecting their work and time, not cheating.
100% agree with this.
I hate cheaters as much as (or probably more than) the average person, but I also don't think it's a great practice to over-accuse. There are too many edge cases that can come into play, which I think is why Canvas puts up its caveat that logs/analytics shouldn't be used for academic integrity issues; there's just no certainty about the cause of the activity that may or may not be logged. I'm sure some students are doing the wrong thing, but for students who are doing nothing wrong, being falsely accused can have a lot of very negative consequences, both in the moment and long-term.
-Chris
I understand and am not completely unsympathetic to the idea of students crafting exam answers in a Word doc during an exam and pasting the answer in, for the reasons described in this thread. However, this leaves us all vulnerable to those, hopefully, relatively few students who will cheat. In my experience teaching at the college level for 13 years, if we do not have some way to monitor, catch, and possibly penalize cheaters, then what inevitably happens is that the number of students who will cheat, even if only "slightly", will increase. It becomes a known strategy and acceptable to some of the students. I have had this happen in the past, which caused me to make a major change in how I handled assessments. I am now planning to include language in my exam instructions that all answers must be typed into the essay question fields, no exceptions. If a student wants to save an answer for future reference, they can copy that answer into a Word document rather than copying from a Word doc into a test field.
In one of my current self-paced online courses, a retrospective analysis flagged about 30% of my students as potentially cheating by pasting text into an exam. My criteria went beyond whether an answer was simply pasted in; for example: was the answer's language consistent with the student's other answers, was enough time spent off the exam that the student could have written the answer, etc. I believe the data in the test log analytics allow us to rule some types of cheating in or out. I also think we have to use those analytic tools carefully and always give students the benefit of the doubt. But I am very, very worried about academic integrity in our online courses, whether they are live online lecture formats or self-paced courses with recorded lectures.
I appreciate your concern about assessment integrity; however, I would strongly advise against this approach, as it leaves students open to losing work in Canvas. It doesn't happen often, but it does happen, and it disadvantages students. If you are this concerned about cheating, it would be better to have invigilated pen-and-paper exams, which have much greater assessment integrity than doing any type of assessment online. Online exams also make it extremely easy for students to collude during the examination.
I appreciate this question, Geoffrey, because as a grad student I often chose to compose my Canvas discussion posts in a different program (like Evernote or Google Docs) and then paste and reformat the text in Canvas. I did this for a few reasons.
First, it was helpful for me to keep a record of my thoughts as they evolved throughout the course, and referring back to them was much easier in a Google Doc than by going to individual discussions. Second, this approach protected against the possibility of losing my post in Canvas, which happened more than a few times.
There would have been no way for my teachers to know about my approach to drafting and writing, unless of course they asked me directly. The time logs alone did not tell the full story of my student experience. I don't know whether your student used a similar approach, or whether they copied their text from some outside source. But copying and pasting by itself isn't necessarily evidence of plagiarism.
I hope sharing this is helpful. I'm new to this community forum and I'd like to be as supportive of other folks in education as I can be.
In peace,
Matthew
Matthew, thanks for your thoughts. I am not talking about discussion posts or other essay-type, untimed assignments. I am specifically concerned about timed exams where pre-prepared answers would not be acceptable. I am simply trying to require the same standards that we would use if it was an old-school hard-copy exam.
Whilst I understand the concerns around this, the same standards cannot apply to timed online exams; it's simply not possible without egregious invasions of student privacy through online proctoring tools, and even those tools can be subverted by students. The exam is no longer a viable option in the age of AI and online learning. Multiple choice and online quizzes, etc., are only suitable for formative learning tasks online, not summative assessment. We need to re-think assessment approaches or go back to the way we did things pre-online learning.
https://blogs.deakin.edu.au/cradle/2023/01/16/1300-years-is-long-enough-its-pens-down-for-the-exam-h...
I appreciate the clarification, Geoffrey. As it turns out, I took a series of 6 timed essay exams in which I wasn't given the prompt for the essay until the top of the writing time (a 3 hour window). Sort of like Comps, but not. And I wrote my entry in the fashion I described above. Working in my own platform and copying my entry into the testing platform still was the safest and most sensible approach to me.
There are a number of answers in this thread that debate options. One caveat we have found in looking at analytics is that students can open more than one browser during an exam. This may have changed, but the first browser tracked and saved student input while the second did not. If a student is required to enter answers in the text box and they do half in one browser and half in another, then their answers are not automatically saved in both.
Most word-processing products also keep metadata on when a document was created and saved. We have had more than one example of a student being able to provide the file, letting us see that they actually did type the answer and then cut and paste it.
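To illustrate the kind of metadata being described, here is a minimal Python sketch (the file name is hypothetical) that reads the created and last-modified timestamps out of a .docx file, which is just a ZIP archive containing a docProps/core.xml metadata part:

```python
# Minimal sketch: read the created/modified timestamps a .docx stores internally.
# "student_answer.docx" is a hypothetical file name for illustration only.
import zipfile
import xml.etree.ElementTree as ET

DCTERMS = "{http://purl.org/dc/terms/}"

with zipfile.ZipFile("student_answer.docx") as docx:
    core = ET.fromstring(docx.read("docProps/core.xml"))

created = core.findtext(f"{DCTERMS}created")    # when the document was first created
modified = core.findtext(f"{DCTERMS}modified")  # when it was last saved
print("created:", created)
print("modified:", modified)
```

Timestamps like these are easy to alter, so they are corroborating detail at best, but when a student volunteers the file they can support the account of how the answer was composed.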
We have instructors who use slightly differently worded questions, such as changing the name of the company or tweaking the numbers, so they can identify a cut-and-paste answer. A physically impossible typing speed is not a smoking gun, but it does warrant further discussion with the student.
The best answer to this is probably that the tools you are referring to provide analytical data, that the most common uses of that data are far more wide-ranging than the detection of cheating, and that none of the tools can provide guaranteed proof of cheating.
While the data can be used to detect anomalies or irregularities, and that may be evidence gathered in deciding whether or not cheating happened, one also has to acknowledge that it takes more investigation than just the existence of an irregularity to prove cheating. At the very least, there would need to be interviews with the student and anyone else who may have knowledge of the incident or its plan. And in the end, even if cheating is proven, it still wouldn't be accurate to characterize the purpose of the analytics tools as being "for the purpose of detecting cheating," when they are used for so many other purposes on a far more consistent basis.
For example, when students at a specific school are using timed writings to build speed and fluency in preparation for the free response sections of a high-stakes exam, there will be several hundred uses of the "time spent" feature in a single day, for a single course, and not a single one of those uses will be related to the detection of cheating. And that is just one of the analytics tools, on one assignment, in one course, in one school, on one day.
It would be fair to say that evidence towards the detection of cheating can be a fringe use of some of the analytics tools, but this is not their most common use, not their purpose, and not, by itself, enough to stand as proof of cheating.
The answer is that the presence of HTML in an answer means either the student knows HTML and chose to use it to enhance their answer, or they copy/pasted.
Some HTML is expected - but it should result in a polished display. If the answer is difficult to read or poorly formatted, the most likely explanation is that the student copy/pasted.
Most browsers have a web inspector (see list at the end) which allows you to see the code behind a student's answer.
The image below shows an answer that was pasted from a text editor: there is no HTML formatting in the content beyond a paragraph (the <p>) with line breaks (<br>s).
This should be read as either the student's original work, or text the student copied from some other source, pasted into a text editor to strip the HTML, and then pasted into Canvas. The only way it could be considered plagiarism is if the text were an exact match to unique text elsewhere and no resource was cited.
Some tags and styling must be allowed. The answer below was copy/pasted from IntelliJ, a tool for writing Java and other programming languages. The consistent font size, uniform colors for keywords, and the presence of style="font-family: 'JetBrains Mono..." are expected.
Content copy/pasted from the internet often looks different and includes distinctive patterns in the underlying HTML. This text was copied from https://www.geeksforgeeks.org/java-hello-world-program/#
Note the number of HTML tags (<div> and <code>) and the presence of class attributes (class="line number1 index0 alt2").
Inspecting the GeeksForGeeks site in the same manner shows the same pattern (<div> and <code> tags), with similar class attributes (class="line number7 index6 alt2").
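If you want to tally those patterns rather than eyeball them in the inspector, a short script can summarize the tags and class attributes in an answer's HTML. This is only a sketch, not a Canvas feature; the two sample strings are made up to mirror the patterns described above:

```python
# Minimal sketch: summarize the HTML inside an answer body the way you would
# eyeball it in a web inspector. The sample strings are invented illustrations.
from collections import Counter
from html.parser import HTMLParser

class TagSummary(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = Counter()       # how often each tag appears
        self.class_attrs = []       # class attribute values, often site-specific

    def handle_starttag(self, tag, attrs):
        self.tags[tag] += 1
        for name, value in attrs:
            if name == "class":
                self.class_attrs.append(value)

def summarize(html):
    parser = TagSummary()
    parser.feed(html)
    return parser.tags, parser.class_attrs

# Pattern 1: pasted from a plain text editor -- just a paragraph with line breaks.
plain = "<p>First line<br>Second line<br>Third line</p>"

# Pattern 2: copied from a web page -- nested div/code tags with site-specific classes.
web = ('<div class="line number1 index0 alt2"><code class="java keyword">public</code>'
       '<code class="java plain"> class HelloWorld</code></div>')

print(summarize(plain))  # only <p> and <br> tags, no class attributes
print(summarize(web))    # div/code tags plus several class attribute values
```

A plain paragraph-with-line-breaks answer produces almost nothing, while text copied from a web page lights up with nested tags and site-specific class names.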
If an essay answer is formatted differently (font family, color, size, spacing), using a web inspector will likely yield helpful information.
Common text emphasis applied with the Canvas editor includes (but isn't limited to) bold, italics, and underlining.
Inspectors
- Chrome DevTools
- Firefox Developer Tools
- Safari Web Inspector (enable the Develop menu first)
- Microsoft Edge DevTools