StMU Research Scholars

Featuring Scholarly Research, Writing, and Media at St. Mary's University
April 7, 2021

Online Exam Proctoring: The Dark Side of Student Privacy and Health

In a digital age, and during a pandemic, demand for virtual classes has exploded. With virtual classes come virtual assignments, quizzes, and exams, which has raised concerns about academic honesty. To enforce it, universities have turned to online proctoring with software like Respondus' LockDown Browser. The way proctoring software operates, however, has made advocacy groups, professors, and students alike concerned about the well-being of students and the collection of their personal data. Others are less worried about how invasive proctoring software is becoming. California State University, Fullerton lecturer Randy Hoffman states, "there isn't anything to worry about unless a student is trying to beat the system."1 As for concerns over students' personal data, ProctorU Media Coordinator Franklin Hayes compares having a proctor watching to "having a cable guy at your house."2 Despite these dismissals, and despite the software's useful services, proctoring software brings major risks to student data privacy through its use of artificial intelligence and the data storage it requires, as well as risks to the well-being of students while they take exams.

First, let's understand what proctoring is. Proctoring is how professors monitor students taking an exam in class: they watch the testing environment to make sure each student takes the exam appropriately. Online proctoring (or e-proctoring) systems serve the same purpose, but the platforms differ in how they work. Honorlock and Proctorio, for example, have students install an add-on that controls what they can do in their browser, while LockDown Browser instead uses a custom browser that takes over the student's computer screen. Each proctoring system "prohibits students from opening new tabs during [exams]" and can "[use] the student's webcam to record a video while they complete the assessment and flag suspicious activity, such as background noise or the student's eyes shifting away from the computer screen." Professors or exam proctors can then review these recordings for possible infractions.3 It sounds like normal proctoring, just online; however, there are several issues with how proctoring software works that raise concerns over student data and health.

Proctoring software makes use of computer webcams and microphones to make recordings. By doing so, it can catch a student looking at a sticky note on their monitor, or capture voices that might be supplying answers to test questions. Without such software, professors cannot be sure whether a student has tampered with the exam. While monitoring students, the software uses additional "features," such as artificial intelligence (AI), to validate the recordings: it tries to determine whether the person taking the exam is really that student, and it automatically flags any suspicious activity throughout the session. Some advocacy groups have argued that having students turn on their webcams and microphones for a stranger, or even for an algorithm, "violate[s] their civil rights."4 The Electronic Frontier Foundation (EFF), a non-profit organization focused on data privacy, and Mutahar Anas, a YouTuber who does virus investigations and software analyses, both call proctoring systems "the equivalent to spyware."5 California Community College Board of Governors member Colm Fitzgerald, who works at Delta College in Stockton, CA, told the college's newspaper that these systems can be "extremely restrictive." He related a story about a student whose brother walked into the room during an e-proctored test; when the student turned around to tell him to leave, he was faulted for violating the testing rules and failed his exam.6 In other words, a student's exam can be thrown out simply because someone forgot an exam was in session, walked into the room, and was interpreted by the AI as evidence of cheating.
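
None of these vendors publish the exact logic behind their automated flagging, so any concrete example has to be hypothetical. The short Python sketch below is purely illustrative: the signal names and thresholds are invented, not taken from LockDown Browser, Proctorio, or any real product. It only shows the general kind of rule-based checks a recording might be run through, and why innocent events like a sibling's voice or a brief glance away can trip a flag.

# Hypothetical, heavily simplified flagging rules -- illustrative only.
# Real proctoring products do not publish their detection logic; every
# signal name and threshold here is invented for the sake of explanation.

from dataclasses import dataclass

@dataclass
class FrameSignals:
    timestamp: float        # seconds into the exam session
    face_detected: bool     # did the face detector find the student?
    gaze_on_screen: bool    # is the estimated gaze inside the screen area?
    audio_level_db: float   # ambient microphone level

def flag_session(frames, gaze_limit_s=5.0, noise_threshold_db=60.0):
    """Return a list of (timestamp, reason) flags for one recorded session."""
    flags = []
    gaze_away_since = None
    for f in frames:
        if not f.face_detected:
            flags.append((f.timestamp, "face not detected"))
        if not f.gaze_on_screen:
            if gaze_away_since is None:
                gaze_away_since = f.timestamp
            elif f.timestamp - gaze_away_since > gaze_limit_s:
                flags.append((f.timestamp, "eyes away from screen"))
                gaze_away_since = None  # start a new window before flagging again
        else:
            gaze_away_since = None
        if f.audio_level_db > noise_threshold_db:
            flags.append((f.timestamp, "background noise"))
    return flags

Even in this toy version, every flag depends on how well the face and gaze detection work and on thresholds the student never sees, which is exactly the gap the Delta College story above falls into: an ordinary interruption looks, to the rules, like cheating.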

The Respondus Monitor Terms of Use as of January 21, 2021 | Courtesy of Azariel Del Carmen on February 16, 2021

Each proctoring platform comes with a privacy policy that is shown to students before their exams start. Most students have no choice but to click "I agree" after glancing at the policy page; students who read the legal terms in detail, however, find out exactly how their data is being used. For Respondus Monitor, the webcam extension to Respondus' LockDown Browser, Respondus states that "the recordings are controlled by your institution and will be processed by an agent of your institution, namely, Respondus, through its Respondus Monitor Services."7 This suggests that a student's webcam and microphone recordings are handled by their university; reading further, however, one comes to a section stating that "random samples of video and/or audio recordings may be collected via Respondus Monitor and [be] used by Respondus to improve the Respondus Monitor capabilities for institutions and students."8 In other words, despite the earlier statement, Respondus reserves the authority to take a student's recording and use it to improve Respondus Monitor, which leaves unclear exactly what the company does with students' recordings.

These statements refer to Respondus' use of student video and audio recordings for the AI algorithm behind Respondus Monitor. In order to catch and flag students for suspicious behavior, the AI algorithm needs to be "trained on thousands of video examples to recognize movements of eyes and head[s] that appear to correlate with [such acts]."9 This is done to catch cheating and suspicious behavior efficiently, but students who do not read the fine print of the Respondus Monitor "Terms of Use" are unaware that their video recordings can be fed into the Respondus Monitor AI system. Students cannot refuse to send their recordings to Respondus unless they refuse to take the exam at all, and that refusal means failing the exam or the class, with no alternative offered. In his video covering Respondus' LockDown Browser, Mutahar Anas tells his viewers, "This is wrong. This is immoral in my eyes," and suggests that students use a separate computer, dual-boot Windows, or push their universities to provide them a dedicated machine, to keep the invasive software that exam security introduces off their personal computers.10

A point can be made that proctoring companies need this data in order for their AI systems to recognize new behaviors, but that need could be met by letting students opt in to, or out of, data collection before the exam starts, with no consequences either way. Other AI-driven programs ask users whether they are comfortable with the software collecting their data, and users who say no can still use the software without issue. Proctoring software, on the other hand, makes opting in mandatory, and not agreeing to the terms can lock a student out of the exam. Even granting that the data collection has a purpose, proctoring companies could add an option that lets students choose whether to send their recordings and still take their exams either way. That option might limit how many samples the companies collect for their AI needs, but it should be sufficient to keep improving the algorithm, and it might bring them closer to data-privacy compliance in the European Union and in the US. EU lawmakers have demanded an investigation to see "whether online proctoring software violates student privacy rights, arguing that it is unfair for universities to force students to use exam monitoring tools that capture their personal biometric data."11 US senators Richard Blumenthal, Ron Wyden, and Chris Van Hollen wrote to ExamSoft, asking the company "how [student] data is being used before, during, and after tests, by [the company], the virtual proctors, and testing administrators" and how it complies with student privacy laws like the Family Educational Rights and Privacy Act (FERPA).12 Even universities like the University of California at Santa Barbara "have advised against the use of test proctoring services in remote learning" due to student protests against proctoring systems and the "aggressive tactics employed by proctoring companies in response to those [protests]."13
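
As a rough sketch of that opt-in approach, consent to share recordings with the vendor can be separated from permission to take the exam at all. The function and field names below are hypothetical; no current proctoring product is claimed to work this way.

# Hypothetical consent gate (not how current proctoring products behave):
# the exam launches either way; only the optional sharing of recording
# samples with the vendor depends on the student's answer.

def launch_exam_session(student_shares_recordings: bool) -> dict:
    return {
        "exam_unlocked": True,                          # never tied to consent
        "record_for_instructor_review": True,           # needed for proctoring itself
        "send_samples_to_vendor_ai": student_shares_recordings,  # strictly opt-in
    }

# A student who declines still takes the exam; the vendor simply receives
# no training samples from that session.
session = launch_exam_session(student_shares_recordings=False)
assert session["exam_unlocked"] and not session["send_samples_to_vendor_ai"]

Fewer students might opt in, so the vendors would collect fewer samples, but that trade-off is the point: consent stops being a condition for taking the exam.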

Artificial Intelligence Vector Graphic | Courtesy of GDJ on Pixabay

There have been concerns that the AI algorithms behind proctoring software are biased and discriminatory toward some students. Simon Coghlan, Tim Miller, and Jeannie Paterson of the University of Melbourne state in their paper that "the field of AI Ethics (and, more broadly, digital ethics) is young and still under development" and "[has] occasionally been criticized for their lack of practical specificity and theoretical philosophical rigor."14 Despite these issues, proctoring companies advertise AI to universities, on their sites and in their advertising campaigns, as a feature that improves academic integrity. Their AI algorithms, however, have raised concerns that the technology is not ready for today's needs and is biased against minority groups, much like earlier AI efforts such as Amazon's AI recruiting tool, which penalized applicants whose resumes contained the word "women" or the names of all-women's colleges.15

In their letter to ExamSoft, Senators Richard Blumenthal, Ron Wyden, and Chris Van Hollen tell the company that "students of color, and students wearing religious dress, like headscarves, have reported issues with the software's inability to recognize their facial features, temporarily barring them from accessing the software."16 While this may look like a technical limitation, the proctoring companies advertise their technology as "elegant, functional, [and] powerful" and as something that "eliminates human error [and] bias," offering no evidence despite these incidents.17 Ian Linkletter, a learning technology specialist, posted his analysis of Proctorio's system on Twitter. In the post, he explains that Proctorio's system is invasive of students' privacy and claims that "it discriminates against students who are marginalized, neurodiverse, or who otherwise do not fit the developers' definition of normal."18 Nir Kshetri, a management professor at the University of North Carolina at Greensboro, explains to The Conversation that "simpler algorithms such as [AI] applications have been mainly trained to identify white males and, consequently, misidentify ethnic minorities," and believes that AI is not appropriate for remote proctoring today, leading many students and advocates to question whether the claims proctoring companies make about their AI systems are actually true.19
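
Kshetri's point about training data can be made concrete with a toy model. The Python sketch below uses entirely synthetic numbers and a simple off-the-shelf classifier, not any vendor's actual algorithm; it only illustrates how a system trained on behavior from one narrow population ends up scoring an honest student who merely behaves differently as more "suspicious."

# Toy illustration of training-data bias -- synthetic data only, not any
# proctoring vendor's model. "Behavior features" stand in for head-movement
# and gaze statistics extracted from exam recordings.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Honest examples drawn from a single, narrow population of test-takers.
honest = rng.normal(loc=0.0, scale=1.0, size=(1000, 4))
cheating = rng.normal(loc=2.0, scale=1.0, size=(100, 4))

X = np.vstack([honest, cheating])
y = np.array([0] * len(honest) + [1] * len(cheating))

model = LogisticRegression().fit(X, y)

# An honest student whose normal behavior simply differs from the training
# population shifts the same features and is scored as far more "suspicious."
typical_student = np.zeros((1, 4))
atypical_student = np.full((1, 4), 1.5)
print(model.predict_proba(typical_student)[0, 1])   # low "cheating" probability
print(model.predict_proba(atypical_student)[0, 1])  # much higher probability

Nothing in the toy model is malicious; the skew comes from what the training data leaves out, which is the same failure mode the senators and Linkletter describe.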

Proctoring companies will state that each of their AI algorithms is trained differently and adapts to new behaviors and to different races and ethnicities; however, Simon Coghlan, Tim Miller, and Jeannie Paterson's paper indicates that bias and discrimination persist in these algorithms. They state that AI "has been criticized as inaccurate, and has [resulted] in legal action, despite the fact that [it] may have been trained on thousands or millions of images."20 Even so, the proctoring companies do not disclose the assessments of their AI that would back up their claims, which raises the question of how they can advertise this feature without actual data, especially given past incidents showing how it treats students of different religions, races, or ethnicities. False flagging by the AI can "harm [student]s' opportunities often in arbitrary and discriminatory ways," and students have no way to see the flags made during an exam or why they were made.21 Unable to find out what they were flagged for, students have a hard time defending themselves to their university or professor if the professor believes they cheated on an exam.

Mental Health Vector Graphic | Courtesy of GDJ on Pixabay

Proctoring software also raises concerns about students' well-being during exams. While many students can take exams under these conditions without trouble, others have health conditions that make doing so much harder. Neither the professor, a remote proctor, nor an AI algorithm necessarily knows about a student's condition before the exam starts. Because of this, many students with conditions like ADHD, trichotillomania, or chronic tic disorder have a harder time taking exams with these systems than they would in person. Ketelyn Vercher, a sophomore at Georgia State University in Atlanta who has ADHD, tells The Signal that "using LockDown Browser while having unmedicated ADHD has made it hard for [her] to take tests."22 Proctoring software can also harm students psychologically. Nir Kshetri explains to The Conversation that "one student vomited [during her exam] due to the stress" and had to remain "at her desk because no bathroom breaks were permitted."23 Proctoring software not only adds mental strain, but also penalizes students whose health conditions affect how they take exams. An AI algorithm or a proctor cannot recognize every behavior that stress or ADHD may produce, and may judge it as inappropriate, compounding the harm with cheating accusations. Mental health and health conditions are real, and the way proctoring software treats each student can cause more harm than good for the sake of a letter grade on an exam.

Universities and proctoring companies will point out that universities can accommodate students with health concerns; however, the accommodations they offer show that proctoring software is not actually necessary. At St. Mary's University in San Antonio, Texas, students with health conditions are given a few choices depending on where they are located: in-town students can be proctored at a testing center, out-of-town students can be proctored in person at another university, and out-of-country students can be proctored via Zoom. Each of these accommodations does away with the proctoring software entirely, which raises the question of why the software is needed in the first place if these options work for some students but cannot be offered to the rest. Accessibility services understand that these systems can treat students with health conditions poorly, and professors and universities should consider the measures they take to keep exams fair as a model for handling exams for all students without harming their well-being during the testing period.

Proctoring software also raises concerns over how it stores students' information. As mentioned previously, proctoring companies like Respondus can store student data for use in AI algorithm testing, but they also store other kinds of personal information, like student IDs or addresses. Nicholas Nguyen, a senior at San Jose State University, used ProctorU to take an exam and told the university's newspaper that "he was asked to answer four personal questions about himself," one being the name of a family member, to prove he was actually himself. It is unusual for a ProctorU proctor to ask a student that kind of question, especially since Nguyen had never shared this particular information with anyone. To clear up the confusion, Franklin Hayes explains that ProctorU got this information from "public records and databases by Acxiom, a data-brokering company," and that some credit card companies use the same method as a security measure.24 While the information is public, it raises questions about what these proctoring companies are gathering on students, and why they need data brokers to verify a student's identity instead of simply checking a photo and a student ID.

ProctorU Screenshot From Twitter Confirming The Data Breach | Courtesy of Azariel Del Carmen on February 16, 2021

The student information held by online proctors has to be stored somewhere. Normally, this data lives on servers hosted by the company and is secured against unauthorized access; these systems can fail, however, and when they do, personal information ends up exposed. Students who have used ProctorU recently had their data leaked on a hacker forum. BleepingComputer's creator Lawrence Abrams reports that the leaked data contained "email addresses, full names, addresses, phone numbers, hashed passwords, the affiliated organization, and [more]" from the University of Texas, Harvard, Yale, and even universities in other countries such as Australia.25 ProctorU stayed silent about the breach until users on Twitter and journalists like Abrams began reporting it, finally confirming the breach on Twitter in early August 2020 after someone tweeted an article from the University of Sydney in Australia discussing it.

ProctorU stated in an update that "data collected by ProctorU prior to March 2015 appeared to have been acquired and posted by an unauthorized user."26 However, Abrams analyzed the leaked database in his article and found that it contained student account information up to 2017.27 Not only did ProctorU stay silent until August 2020, but the company also failed to notice that the data included student information up to 2017. Even if the data is three to eight years old, a breach of this kind is still serious, especially one that leaks students' personal information. Data breaches happen all the time, from Yahoo in 2014 to Equifax in 2017, but the existence of this kind of information shows how much data these companies retain from students years after they stop using their services. And breaches like this may expose many of these students to identity fraud because of the lack of security on the companies' servers.

Other proctoring companies will state that they do not gather and store the same type of information that ProctorU does. This is untrue, however: Jason Kelley and Lindsay Oliver of the EFF report that "[proctoring companies] retain much of what they gather[,] whether that's documentation or video of bedroom scans," that they have "no time limits on [data] retention," and that they may share it with third parties.28 Respondus, for example, claims that "the recordings are controlled by [the student's] institution," but admits that "random samples of video and/or audio recordings…may be shared with researchers (research institutions and/or biometric experts) under contract with Respondus," which EPIC calls "a transfer of data which would necessarily include identifiable images of students' faces," misleading students and institutions about what is shared with third parties.29 Students have no control over how their data is handled, even though the recordings show their own faces; if a student asks a proctoring company to delete the data, the company's position is that the student is not the data's 'owner,' their institution is.30

Proctoring companies have exposed student data before. A student on the social platform Reddit complained about their experience with Proctorio's support team after the software kept crashing during their midterms. Proctorio's CEO Mike Olsen, who read the post, decided to publish the student's support logs, telling the student, "If you're gonna lie bro . . . don't do it when the company clearly has an entire transcript of your conversation." The move was "quickly panned as an inappropriate use of corporate data and an invasion of privacy" by other users, and the CEO later apologized and removed the post.31 Using student logs the way Proctorio's CEO did on Reddit shows not only that the company stores this information, but that it is willing to use it against students. A company is supposed to keep such data private, not post it publicly when it is criticized; using this information against a student is unprofessional and a violation of students' data rights and of laws like FERPA.

The Zoom Video Conferencing Logo | Courtesy of Zoom Video Communications, Inc.

With proctoring software causing more issues and concerns for students, professors, and activists than it fixes, some universities are already looking at alternative ways to handle their exams, with students leading the way with ideas. Alyssa Origer, a student at Delta College, suggests that "if [professors] want to make sure we aren't cheating, they could just hold a Zoom meeting and monitor us themselves." This would let professors watch students take their exams much as they would in class, and it removes the problem of a random proctor or "a faulty [AI] algorithm deciding whether we are cheating or not."32 Another solution is to build tests and quizzes around short-answer responses rather than multiple choice. Exams of this kind let professors more easily tell whether a student plagiarized a response, and they require students to put real effort into the exam topics in order to pass. It will not stop the use of Google or the re-wording of other people's answers, but it makes it much easier to tell whether a response came from the internet, from someone else, or from the actual readings. One last option is to assign the class a project rather than an exam. As with short-answer questions, this lets professors check the quality of the work for plagiarism with tools like Turnitin and pushes students to put effort into the topic and the project's requirements. The professor can then review the work and its plagiarism score and email students questions about their projects to see whether each student understood the topic or violated academic honesty by relying on someone else's work or the internet.

While the use of proctoring software answers a real academic need in today's society, it poses more risk to students' data and well-being than it delivers in maintaining academic honesty in the classroom. "No student should be forced to make the choice to either hand over their [data] and be surveilled continuously or to fail their class."33 Proctoring companies advertise that they prevent cheating, but a study by Michael N. Karim, Samuel E. Kaminsky, and Tara S. Behrend at George Washington University in Washington, DC found that, even though monitoring can deter cheating, "neither of the main effects were significant, suggesting there was no overall mean difference between the proctored and unproctored [exams]."34 Universities and schools that use proctoring software need to reconsider the options available to them and move away from such systems for exams and quizzes, toward something that benefits students and is openly accessible to them, rather than a low-cost fix for all exams that makes students give up their data and risk their mental and physical health.35 Until then, our classes will keep running invasive proctoring systems that bring students more negatives than positives, all for a letter grade on the way to a degree.

  1. Jared Eprem, “Exam Monitoring Platform Creates Privacy Concerns,” Daily Titan, September 27, 2020, https://dailytitan.com/news/campus/exam-monitoring-platform-creates-privacy-concerns/article_0f76ff90-0142-11eb-acb5-db7ebc361b7c.html.
  2. Nicholas Ibarra and Yasmine Mahmoud, “Online Proctoring Raises Privacy Concerns,” Spartan Daily: San Jose State University, March 31, 2014.
  3. Hannah Workman, “Online Test Proctoring Services Cause Concerns about Equity, Privacy,” Collegian: Delta College, October 30, 2020, http://deltacollegian.net/2020/10/30/online-test-proctoring-services-cause-concerns-about-equity-privacy.
  4. Nir Kshetri, “Remote Education Is Rife with Threats to Student Privacy,” The Conversation, November 6, 2020, http://theconversation.com/remote-education-is-rife-with-threats-to-student-privacy-148955.
  5. Mutahar Anas, Why I Don’t Trust Online School Software…, 2020, https://www.youtube.com/watch?v=wgZlQbDY6QA.
  6. Hannah Workman, “Online Test Proctoring Services Cause Concerns about Equity, Privacy,” Collegian: Delta College, October 30, 2020, http://deltacollegian.net/2020/10/30/online-test-proctoring-services-cause-concerns-about-equity-privacy.
  7. Respondus, “Terms of Use – Respondus Monitor (Student),” Respondus, January 21, 2021, https://web.respondus.com/tou-monitor-student/.
  8. Respondus, “Terms of Use – Respondus Monitor (Student),” Respondus, January 21, 2021, https://web.respondus.com/tou-monitor-student/.
  9. Simon Coghlan, Tim Miller, and Jeannie Paterson, “Good Proctor or ‘Big Brother’? AI Ethics and Online Exam Supervision Technologies,” ArXiv:2011.07647 Cs, November 15, 2020, http://arxiv.org/abs/2011.07647, 4.
  10. Mutahar Anas, Why I Don’t Trust Online School Software…, 2020, https://www.youtube.com/watch?v=wgZlQbDY6QA.
  11. “EU Lawmakers Call for Online Exam Proctoring Privacy Probe,” Times Higher Education (THE), May 5, 2020, https://www.timeshighereducation.com/news/eu-lawmakers-call-online-exam-proctoring-privacy-probe.
  12. Richard Blumenthal et al., “Letter to Ed Testing Software Companies ExamSoft,” Senate Letter, December 3, 2020, https://www.blumenthal.senate.gov/imo/media/doc/2020.12.3%20Letter%20to%20Ed%20Testing%20Software%20Companies%20ExamSoft.pdf.
  13. “In the Matter of Online Test Proctoring Companies Respondus, Inc.; ProctorU, Inc.; Proctorio, Inc.; Examity, Inc., and Honorlock, Inc.” (The Electronic Privacy Information Center (EPIC), December 9, 2020), https://epic.org/privacy/dccppa/online-test-proctoring/EPIC-complaint-in-re-online-test-proctoring-companies-12-09-20.pdf, 5.
  14. Simon Coghlan, Tim Miller, and Jeannie Paterson, “Good Proctor or ‘Big Brother’? AI Ethics and Online Exam Supervision Technologies,” ArXiv:2011.07647 Cs, November 15, 2020, http://arxiv.org/abs/2011.07647, 3.
  15. “In the Matter of Online Test Proctoring Companies Respondus, Inc.; ProctorU, Inc.; Proctorio, Inc.; Examity, Inc., and Honorlock, Inc.” (The Electronic Privacy Information Center (EPIC), December 9, 2020), https://epic.org/privacy/dccppa/online-test-proctoring/EPIC-complaint-in-re-online-test-proctoring-companies-12-09-20.pdf, 11.
  16. Richard Blumenthal et al., “Letter to Ed Testing Software Companies ExamSoft,” Senate Letter, December 3, 2020, https://www.blumenthal.senate.gov/imo/media/doc/2020.12.3%20Letter%20to%20Ed%20Testing%20Software%20Companies%20ExamSoft.pdf.
  17. “In the Matter of Online Test Proctoring Companies Respondus, Inc.; ProctorU, Inc.; Proctorio, Inc.; Examity, Inc., and Honorlock, Inc.” (The Electronic Privacy Information Center (EPIC), December 9, 2020), https://epic.org/privacy/dccppa/online-test-proctoring/EPIC-complaint-in-re-online-test-proctoring-companies-12-09-20.pdf, 12-14.
  18. “In the Matter of Online Test Proctoring Companies Respondus, Inc.; ProctorU, Inc.; Proctorio, Inc.; Examity, Inc., and Honorlock, Inc.” (The Electronic Privacy Information Center (EPIC), December 9, 2020), https://epic.org/privacy/dccppa/online-test-proctoring/EPIC-complaint-in-re-online-test-proctoring-companies-12-09-20.pdf, 14.
  19. Nir Kshetri, “Remote Education Is Rife with Threats to Student Privacy,” The Conversation, November 6, 2020, http://theconversation.com/remote-education-is-rife-with-threats-to-student-privacy-148955.
  20. Simon Coghlan, Tim Miller, and Jeannie Paterson, “Good Proctor or ‘Big Brother’? AI Ethics and Online Exam Supervision Technologies,” ArXiv:2011.07647 Cs, November 15, 2020, http://arxiv.org/abs/2011.07647, 6.
  21. “In the Matter of Online Test Proctoring Companies Respondus, Inc.; ProctorU, Inc.; Proctorio, Inc.; Examity, Inc., and Honorlock, Inc.” (The Electronic Privacy Information Center (EPIC), December 9, 2020), https://epic.org/privacy/dccppa/online-test-proctoring/EPIC-complaint-in-re-online-test-proctoring-companies-12-09-20.pdf, 11.
  22. Callie McNorton, “LockDown Browser Is an Invading Privacy,” The Signal (blog), October 27, 2020, https://georgiastatesignal.com/lockdown-browser-is-an-invading-privacy/.
  23. Nir Kshetri, “Remote Education Is Rife with Threats to Student Privacy,” The Conversation, November 6, 2020, http://theconversation.com/remote-education-is-rife-with-threats-to-student-privacy-148955.
  24. Nicholas Ibarra and Yasmine Mahmoud, “Online Proctoring Raises Privacy Concerns,” Spartan Daily: San Jose State University, March 31, 2014.
  25. Lawrence Abrams, “ProctorU Confirms Data Breach after Database Leaked Online,” BleepingComputer, August 9, 2020, https://www.bleepingcomputer.com/news/security/proctoru-confirms-data-breach-after-database-leaked-online/.
  26. “Security Update for ProctorU Clients,” ProctorU, accessed February 16, 2021, https://www.proctoru.com/security-update.
  27. Lawrence Abrams, “ProctorU Confirms Data Breach after Database Leaked Online,” BleepingComputer, August 9, 2020, https://www.bleepingcomputer.com/news/security/proctoru-confirms-data-breach-after-database-leaked-online/.
  28. Jason Kelley and Lindsay Oliver, “Proctoring Apps Subject Students to Unnecessary Surveillance,” Electronic Frontier Foundation, August 20, 2020, https://www.eff.org/deeplinks/2020/08/proctoring-apps-subject-students-unnecessary-surveillance.
  29. Respondus, “Terms of Use – Respondus Monitor (Student),” Respondus, January 21, 2021, https://web.respondus.com/tou-monitor-student/; Respondus, “Terms of Use – Respondus Monitor (Institution),” Respondus, 2020, https://web.respondus.com/tou-monitor-admin/; “In the Matter of Online Test Proctoring Companies Respondus, Inc.; ProctorU, Inc.; Proctorio, Inc.; Examity, Inc., and Honorlock, Inc.” (The Electronic Privacy Information Center (EPIC), December 9, 2020), https://epic.org/privacy/dccppa/online-test-proctoring/EPIC-complaint-in-re-online-test-proctoring-companies-12-09-20.pdf, 7.
  30. Jason Kelley and Lindsay Oliver, “Proctoring Apps Subject Students to Unnecessary Surveillance,” Electronic Frontier Foundation, August 20, 2020, https://www.eff.org/deeplinks/2020/08/proctoring-apps-subject-students-unnecessary-surveillance.
  31. “In the Matter of Online Test Proctoring Companies Respondus, Inc.; ProctorU, Inc.; Proctorio, Inc.; Examity, Inc., and Honorlock, Inc.” (The Electronic Privacy Information Center (EPIC), December 9, 2020), https://epic.org/privacy/dccppa/online-test-proctoring/EPIC-complaint-in-re-online-test-proctoring-companies-12-09-20.pdf, 9.
  32. Hannah Workman, “Online Test Proctoring Services Cause Concerns about Equity, Privacy,” Collegian: Delta College, October 30, 2020, http://deltacollegian.net/2020/10/30/online-test-proctoring-services-cause-concerns-about-equity-privacy.
  33. Jason Kelley and Lindsay Oliver, “Proctoring Apps Subject Students to Unnecessary Surveillance,” Electronic Frontier Foundation, August 20, 2020, https://www.eff.org/deeplinks/2020/08/proctoring-apps-subject-students-unnecessary-surveillance.
  34. Michael Karim, Samuel Kaminsky, and Tara Behrend, “Cheating, Reactions, and Performance in Remotely Proctored Testing: An Exploratory Experimental Study,” Journal of Business and Psychology 29, no. 4 (December 1, 2014): 555–72, 9.
  35. Callie McNorton, “LockDown Browser Is an Invading Privacy,” The Signal (blog), October 27, 2020, https://georgiastatesignal.com/lockdown-browser-is-an-invading-privacy/.

Azariel Del Carmen

Azariel Del Carmen is a student majoring in Computer Science in the Class of 2024 at St. Mary's University in San Antonio. He loves to learn and hear new things relating to technology. He is inspired by what he does, by the people he looks up to, and by the person he wants to become in his life and goals.


94 comments

  • Santos Mencio

    A fantastic article about an issue that seems to be too often ignored. I personally have never liked services like Lockdown Browser. I feel that they violate my privacy. The idea that audio and video footage of me, my personal information, or my room could be seen by someone I know nothing about is paralyzing. While I do disagree that AI could discriminate against minorities, this article overall does a great job talking about an issue facing students in the modern world.

    • Azariel Del Carmen

      Hi Santos. While I do get their use cases, and here at St. Mary's it is normally our professors who watch our exams, it still feels like a breach of privacy when your only option to take an exam in a class is giving up the privacy of your home, dorm, or anywhere in the general vicinity. I understand you disagree with my take on AI, and I want to say that not every system recognizes everyone well, as others have said in the article and in the comments below. Advertising these AI systems as a feature that can be relied on fully, without concrete evidence, is something I find problematic, especially given student reports of having to sit in bright light for some exams because the system could not detect students wearing religious clothing. Incidents like those, which the EPIC complaint cites, throw the whole idea that their AI systems are good out the window.

  • Camila Garcia

    This article is really interesting. I’ve always found lockdown browser to be a pain, but now after realizing how intrusive it is I think it’s something schools should reconsider. I had no idea that the software stored the data it collected. I’ve never paid attention to the privacy policy because I never found it important, but now I realize that I should have.

    • Azariel Del Carmen

      Using proctoring software is a pain indeed. Everyone has to deal with its complex setup process, along with glitches in the software or AI, unexpected interruptions, and more that can affect any person and any exam in many different ways. I do agree that schools need to reconsider their approach to academic integrity, and I'm glad you found the article interesting and took away its important details.

  • Hailey Lechuga

    This was a fascinating article. I have always thought of the lockdown browsers that record your image and audio as extremely intrusive. Although I understand the frustration these institutions must face with the drastic increase in opportunities for academic dishonesty in the virtual school setting, the fact that I am being recorded while I am already nervous about the exam itself brought a lot of anxiety. Even so, I had never considered how this information is being stored. The fact that even the AI we have created will discriminate against people of color and exclude other populations like neurodivergent students is very concerning, and we must get better at including all types of people in all aspects of society.

    • Azariel Del Carmen

      I totally agree. I had never heard of LockDown Browser, which we use here, until it was pushed in my high school for one class. I get why these tools are used, but this is just not the way to go for academic honesty. There are better ways to monitor exams, and some professors are already using them, like monitoring class time over Zoom or giving an exam in person, as happened once last semester; the alternatives listed here, which avoid all of this, could be adopted by the university and professors. I hope they realize these effects from what other students are saying and consider other options for online exams rather than sticking with this kind of software.

  • Monserrat Garcia

    Extremely interesting and fascinating article! I had never thought of the consequences of lockdown browsers when taking online exams… I have not had that many tests online that require any sort of supervision but if I did I would relate more with the students who feel violated depending on what resource they use to keep accountability… I also really enjoyed the way you incorporated people’s stories and mindsets from all over the US. Thank you for this information!

    • Azariel Del Carmen

      Hi Monserrat. I never thought about the issues with LockDown Browser itself back in the fall semester until I learned more through my research, mainly Mutahar Anas' video covering proctoring software, which specifically analyzed LockDown Browser. I do feel like my privacy is being violated when I take an exam, since I can't even consent to data collection or take the exam in a different manner. What I learned here, along with past experiences with LockDown Browser in other classes, has made me trust these systems less and less and feel like I am just someone to be watched for ulterior motives.

  • Allison Grijalva

    Hi Azariel! This was a great topic because of how relevant it is to students all over the world amidst the COVID-19 pandemic. While online learning does have benefits, I appreciate you researching a negative side to it that is not always discussed. There is a fine line between protecting students' privacy and safety and monitoring students during online exams. While there is a need to recreate a classroom-like setting online for exams, these proctoring systems seem to be incredibly invasive with student information. As we slowly but surely move back to in-person instruction, my hope is that online exams will be mitigated.

  • Yousef Alghamdi

    Hi Azariel,
    This is such a well-written paper that relates to our reality nowadays. The article opened my eyes to the consequences of over-relying on online education. I did not know that privacy was an issue with online education programs; I thought they were extremely safe. I think we have to develop this side of technology, because we will not be able to give it up; it is already part of our lives.

    • Azariel Del Carmen

      I completely understand the reliance on online education, due to the pandemic and regional reasons, but giving up your privacy on an exam just for a grade, with no option to manage your data or take the test another way, is something I don't agree with. I thought these services were safe too until I learned about ProctorU, and it begs the question of whether we can even trust others with our data, whether it is held by them or by an institution.

  • Valeria Varela

    Your article was very informative on the topic. I actually didn't know how much risk we were putting each other in when using these kinds of software. I didn't put much thought into it when I was using them, because at this point I think we're programmed into thinking that this kind of software is normal, or something we have no choice but to accept as normal. But after reading your article, I do agree that universities and colleges need to look at alternatives other than quick "fixes" that not only risk student data but also affect students' mental health negatively.

  • Edward Cerna

    What made me click on the article was the title since it is currently what we as students are dealing with. This was a very well written article that educated me a great deal about something I knew about but did not know how it operated. I found it alarming that the AI can be discriminatory towards certain groups of students. When I use programs like this for my classes none of this stuff initially came to my mind. This article covered many aspects of these programs.

    • Azariel Del Carmen

      I'm glad this article made you aware of how the proctoring systems used by universities and other campuses affect us as a whole. AI is still too new to rely on, and the way people describe their experiences with these systems, having to fight the algorithm just to get it to see their face, is something a student shouldn't have to deal with during an exam. A student needs to focus on the exam, not on the software and the exam at the same time.

  • Edgardo Llosa

    I really identified with this article. The amount of stress this new online system gives students is much higher than in normal classes, even though you have the comfort of taking an exam in your living room. This article also opens your eyes to the amount of private information these new pages and applications require, and it makes you feel that everyone can see you even though you are at your house.

    • Azariel Del Carmen

      I agree. These systems just add a level of stress for students who only want to get an exam over with and worry about other things in their lives or other classes. Most of the student information they collect is unneeded: they just need basic school credentials to log you in to Canvas and the like, and they shouldn't randomly pick your recording to train their AI algorithms without your explicit permission. Agreeing to the terms isn't genuine data consent; it's "be collected or fail your course."

  • Madeline Chandler

    This is such a relevant article to speak about in the midst of online education. It is truly fascinating and honestly frightening that education and technology systems are exploiting students' privacy. I understand the need for an honesty policy, but I think there needs to be a system in place that does not infringe on privacy. Great job!

    • Azariel Del Carmen

      I agree. There are many alternatives professors and universities can use that don't rely on these systems that exploit students' information. Some professors already use Zoom proctoring in different ways, and a few have started to give exams in person. I get the need to make exams fair, but this kind of data collection should concern many students before their exams or quizzes, whether proctoring software is used or not.
