For teachers and lecturers, monitoring students during remote exams is a real problem. For this reason, they often turn to special proctoring software that uses artificial intelligence to flag suspected cheating. The trouble starts, however, when the program provides incorrect information and reports fraud that never actually occurred.
According to a report from The Verge, software of this kind, most notably Proctorio, is used by teachers to detect students' cheating attempts. The software relies on artificial intelligence to detect the faces of the students being examined. Its algorithms verify the user's identity, and the program also offers lockdown options that prevent the use of prohibited materials: Proctorio can disable the clipboard, force full-screen mode, or block the use of multiple monitors to ensure that students cannot cheat on the exam.
However, not everything is rosy: according to a Motherboard report, the software has serious trouble detecting the faces of people whose skin is not white. The Verge reports that students of color must go to extreme lengths to get the software to recognize them at all. Otherwise, if Proctorio fails to detect their face during the exam, it automatically flags the student to the examiner as a suspected cheater.
Akash Satheesan decided to examine how the program works and found that its face-detection code looked and behaved identically to OpenCV, an open-source computer-vision library whose face-detection models are widely known to struggle with darker skin tones. His tests with OpenCV reproduced the same failures to detect people's faces, and Proctorio confirmed that it uses OpenCV's functionality.
Because of these problems, many students have accused Proctorio of racism. Given that their academic futures often depend on this kind of software, their frustration is hardly surprising. One can only hope that the program's developers take the students' dissatisfaction to heart and improve the algorithms so that they can reliably recognize the faces of people of all races.