The Problem of Exam Surveillance Software
Exam surveillance software raises a host of issues, from technical problems to outright racial bias. In short:
- They require a laptop or regular computer: students using phones or low-powered computers can’t use them.
- They’re expensive: the software we piloted last year cost $5.50/exam/student. A class of 30 students, each taking 3 exams, would therefore cost almost $500 to proctor.
- They’re invasive: students are required to show video of their homes throughout the exam.
- They’re biased, with documented impacts on students of color and students with disabilities.
- They’re ineffective. (Just search on “How to cheat using ____name of tool___”.)
Below are some details to consider when evaluating this software.
Technical requirements
The tools don’t work on tablets and phones, and we know that a sizable chunk of our student body relies on such devices as their primary or only internet access. For instance, Pew Research reports the percentage of adults who "do not use broadband at home but own smartphones":
- 22% of 18-29 year olds
- 25% of Hispanics
- 23% of Blacks
- 12% of Whites
- 26% of those who make less than $30,000/year
- 24% with high school diploma as their highest education
Notice two things about these stats: (1) how closely they describe our student body, and (2) how big the gap is between White students and Black and Hispanic students.
While we’ve managed to distribute a fair number of laptops, hotspots are limited. Even with a minimum technology requirement in place for the course, students are cobbling together access in ways that don’t support this software.
Cost
As noted above, the software we piloted last year cost $5.50/exam/student. A class of 30 students, each taking 3 exams, would therefore cost almost $500 to proctor. Add more exams, and the cost goes up. The college budget is unable to support this at this time, and passing fees to students doesn’t seem like a great approach at this point either.
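For budgeting purposes, the cost arithmetic above can be sketched as a short calculation. The $5.50/exam/student rate is from our pilot; the class size and exam count are the illustrative values used in the text, not fixed figures.

```python
# Per-class proctoring cost estimate.
# The rate ($5.50/exam/student) comes from last year's pilot;
# class size and exam count below are the example values from the text.

def proctoring_cost(students, exams, rate_per_exam=5.50):
    """Total proctoring cost for one class section, in dollars."""
    return students * exams * rate_per_exam

# The example from the text: 30 students, 3 exams each.
print(f"${proctoring_cost(30, 3):,.2f}")  # prints "$495.00"
```

Scaling this across sections makes the budget pressure clear: ten such sections would run nearly $5,000 per term at the pilot rate.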
Invasiveness
Exam surveillance tools require that students take the exam in a quiet location where there are no other people (or pets). The software will report students if there is background noise, or if movement is detected in the room where the computer/webcam is located. Some tools monitor the traffic on that home’s internet connection, so if someone else is googling something related to the exam, it’s reported. In other words, if I’m taking an astronomy test and my kid is upstairs checking out the Mars rover, I’m flagged.
In addition, the tools record the audio and video for the room throughout the session, and retain it.
Consider the following scenarios:
- Students who have things going on at home that they don’t want recorded, for instance an elder with dementia, or an undocumented relative living with them.
- Students who either wear or have family members who wear head coverings in public; their families would have to cover during the exam, even though they’re at home.
- Students who are unwilling to be recorded for religious reasons.
Some students may also object to the privacy invasion on principle. One nearby college reported that this issue caused students to drop classes.
Bias
Both standard practices for using the systems and the algorithms for detecting cheating are sources of bias. For instance:
- Most systems require that students show a government-issued ID, such as a driver’s license, military ID, or college ID card. Undocumented students and younger students may not have these.
- Students with dark skin are often asked to change their lighting so the system can better monitor them.
- Students with an illness or disability may be flagged for cheating based on normal behavior, for instance dyskinesia (uncontrolled body movements) or eye-tracking disabilities.
The feedback on accessibility testing for all of the key vendors in this market is mixed. The additional tech overhead, plus the pressure created by AI and human proctors interrupting the test, would need to be factored into extended time or other accommodations. Students whose disabilities relate to executive function and focus will find these additional interruptions especially challenging.
Effectiveness
These tools do not prevent all forms of cheating. For instance, a recent Google search on “how to cheat with Honorlock” yielded a list of workarounds; insert the name of another tool into that search query and you’ll see more options.
Even the identification requirement is easily defeated. We don’t have first-hand experience with this, but it’s been said that in the 1990s students made passable fake IDs, using only poster board and Polaroid cameras, to get access to alcohol. Modern image-editing software would certainly make this easier.