Eproctoring horror story?
Want to tell Fight for the Future about your eproctoring horror story? We’d like to hear it.
Protect student privacy:
Tell colleges and universities that invasive proctoring apps don’t belong in education.
Popular eproctoring apps like Proctorio use racist algorithms that are less able to detect dark-skinned faces. Take our test below: the widget we built uses the same computer vision software Proctorio uses, but unlike Proctorio, it collects no data.
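For readers curious what a check like the widget's looks like under the hood, here is a minimal sketch in Python using OpenCV's stock Haar cascade face detector, the same family of computer vision tooling researchers have reported finding inside Proctorio. The filename and detector parameters are illustrative placeholders, not Proctorio's actual code or our widget's exact implementation.

```python
# A minimal sketch, assuming OpenCV is installed (pip install opencv-python).
# "selfie.jpg" is a placeholder path for a webcam-style photo of the test-taker.
import cv2

image = cv2.imread("selfie.jpg")
if image is None:
    raise SystemExit("Could not read selfie.jpg")

# Haar cascade face detection works on grayscale input.
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Load the frontal-face Haar cascade that ships with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# detectMultiScale returns one bounding box per face the model finds;
# an empty result means the detector registered no face at all.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if len(faces) == 0:
    print("No face detected -- the kind of result that can lock a student out of an exam.")
else:
    print(f"Detected {len(faces)} face(s) at: {faces.tolist()}")
```

Note that everything in this sketch runs locally on the test image, which is also why the widget above can run its check without sending any data to a server.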
We asked prominent colleges and universities if they plan to use eproctoring in the 2021 fall semester. Some were happy to provide a statement confirming that their school is not using eproctoring and WON'T USE it in the future. Others are marked MIGHT USE, because they either failed to respond to our requests or issued a statement implying they may use this tech in the future. Even worse, many schools ARE USING invasive proctoring apps and plan to keep doing so in the fall. Scroll down to see whether your campus is experimenting on you.
To School Administrators:
Invasive proctoring apps are endangering the privacy and the education of students worldwide. This experimental technology can give third-party companies access to all corners of students' computers, gather biometric data, and rely on junk science to decide whether students are cheating.
These proctoring apps are largely unregulated, and their use is generally left up to individual professors. They unfairly put students' ability to complete courses in jeopardy, expose their private information and images to abuse, and disproportionately impact Black and Brown students, students with disabilities, and low-income students.
Inappropriate surveillance should not be a requirement for getting an education. We call on you to immediately make your policy firm and clear by banning proctoring apps from your institution entirely.
We, the undersigned, are calling for a ban on the use of eproctoring programs in K-12 schools and higher education.
Eproctoring programs are invasive, dangerous, and fail to prevent academic dishonesty. They demonstrate systemic bias against non-white students, are harmful for students with testing anxiety, and discriminate against students with disabilities. They also treat students as if they are guilty until proven innocent, which is a disrespectful and harmful stance for any academic institution to take.
Eproctoring programs that include facial detection, recognition, or monitoring are notoriously unable to identify students of color, particularly Black students. A federally funded study found that even the best facial recognition algorithms are significantly less accurate for Black and brown people, trans and non-binary people, and children and women in general. As a workaround, there have been numerous accounts of students of color being forced to shine lights directly at their faces in order to be recognized by the software, which undoubtedly impacts testing performance.
Many online proctoring programs record students and their rooms through webcams. This surveillance is not only invasive, but flags even mundane actions as potential cheating, such as a student reading a question aloud (a common testing strategy for people who learn differently). Eye tracking may flag "too much movement" against an arbitrary baseline set by the algorithm. This is harmful because frequent eye movement can easily be attributed to medical conditions, anxiety, learning differences, and/or neurodivergence such as ADHD or autism.
Even eproctoring solutions that do not use a video component force students to surrender an unacceptable amount of control over their devices to a third-party company, including browser histories, keystroke tracking, and the ability to change privacy settings. The databases these companies accrue on students have already been hacked and remain vulnerable.
Eproctoring companies are facing numerous lawsuits for compromising user data and misleading students about their data collection practices, especially the collection of biometric data such as face scans, voiceprints, and fingerprints. Some students are suing their academic institutions for forcing them to use what amounts to glorified spyware in order to complete their classes.
For the equity and privacy of all students, school administrators must ban the use of eproctoring.
Surveillance companies know that schools are desperate for a solution to online testing. But experts warn that these spyware-like technologies place students in danger and increase systemic inequality in education. Proctoring apps claim to simply monitor students completing tests or other coursework from home. But what they actually do goes much further:
This is an abuse of the concept of consent and risks desensitizing people to surveillance. Eproctoring also treats students as if they are guilty until proven innocent, which is a concerning and disrespectful stance for any academic institution to take.
These surveillance tools are inappropriate, period. But they also harm some groups more than others. Products that include facial detection, recognition, or monitoring demonstrate systemic bias against students of color, particularly Black students. There have been numerous accounts of students of color being forced to shine lights directly at their faces in order to be recognized by the software, which undoubtedly impacts testing performance. And low-income students may not have the high-speed internet or functioning computers these exams require.
The eye tracking used by eproctoring services flags "too much movement" against an arbitrary baseline set by the algorithm. Numerous medical conditions could be responsible for this behavior, and frequent eye movement can easily be attributed to anxiety, learning differences, and/or neurodivergence such as ADHD or autism. Students with disabilities could be flagged for "suspicious" movements or experience increased anxiety in test-taking due to these monitoring measures. And there is plenty of data indicating that testing anxiety negatively impacts performance, with eproctoring in particular associated with heightened anxiety.
In August 2020, the sensitive personal data of over 400,000 students leaked from ProctorU. Proctorio is threatening students and suing administrators who critique or expose its flawed and invasive features. ProctorU sent a cease-and-desist letter to a staff member who opposed its implementation. These companies are being pushed into schools by large educational publishing houses, which are also seeking to profit off the normalization of inappropriate mass surveillance. All of these actions have made students afraid to express their valid concerns about these apps.
Eproctoring companies are facing numerous lawsuits for compromising user data and misleading students about their data collection practices, especially the collection of biometric data such as face scans, voiceprints, and fingerprints. Students are also suing their academic institutions for forcing them to use what amounts to glorified spyware in order to complete their classes.
With eproctoring, algorithmic and human biases compound to create situations where the most marginalized and least represented students are at the highest risk of being accused of academic dishonesty. These accusations can have major consequences with limited recourse, and they place an undue and unnecessary burden on students. Recognizing these dangers, some campuses are discontinuing or strongly discouraging the use of eproctoring, including the University of Michigan-Dearborn, the University of Southern California, and the University of British Columbia.
Schools should not be using eproctoring. Surveillance tools treat students as if they are guilty until proven innocent, and won’t actually stop students who are committed to cheating. Professors can choose alternative ways to assess student performance without using invasive surveillance, and schools must end any existing contracts with remote proctoring providers and ban this technology from their campuses.