This Student Is Taking On ‘Biased’ Exam Software


Pocornie’s legal case is still ongoing. In December, the Dutch Institute of Human Rights issued an interim ruling saying it strongly suspected that the software used by VU Amsterdam was discriminatory and giving the university 10 weeks to file its defense. That defense has not yet been made public, but VU Amsterdam has previously argued that Pocornie’s log data—showing how long she took to log into her exam and how many times she had to restart the software—imply her problems were due to an unstable internet connection rather than to issues with the face detection technology. A ruling is expected later this year.

Producers of anti-cheating software like Proctorio were boosted by the pandemic, as exam halls were replaced by students’ own homes. Digital monitoring was meant to help schools and universities maintain business as usual throughout lockdown—without creating an opportunity for unsupervised students to cheat. But the pandemic is over and the software is still being used, even as students around the world return to in-person teaching. “We don’t believe it is going away,” said Jason Kelley, who focuses on student surveillance at the US-based Electronic Frontier Foundation, in a December 2022 review of the state of student privacy.

In the US, Amaya Ross says her college in Ohio still uses anti-cheating software. But every time she logs in, she feels anxious that her experience during the pandemic will repeat itself. Ross, who is Black, also says she couldn’t access her test when she first encountered the software back in 2021. “It just kept saying: We can’t recognize your face,” says Ross, who was 20 at the time. After receiving that message three or four times, she started playing around with nearby lamps and the window blinds. She even tried taking a test standing up, directly underneath her ceiling light.

Eventually she discovered that if she balanced an LED flashlight on a shelf near her desk and directed it straight at her face, she was able to take her science test—even though the light was almost blinding. She compares the experience to driving at night toward an oncoming car with its headlights on full beam. “You just had to power through until it was done,” she says.

Ross declines to name the company that made the software she still uses (Proctorio has sued at least one of its critics). But after her mother, Janice Wyatt-Ross, posted about what happened on Twitter, Ross says a representative from the business reached out, advising her to stop taking tests in front of white walls. Now she takes tests with a multi-colored wall-hanging behind her, which so far seems to work. When Ross asked some of her Black or darker-skinned friends about the software, a lot of them had experienced similar problems. “But then I asked my white friends and they’re like, ‘I’m taking tests in the dark,’” she says. 

Typically, face-recognition and detection technology fails to recognize people with darker skin when companies use models that were not trained on diverse data sets, says Deborah Raji, a fellow with the Mozilla Foundation. In 2019, Raji co-published an audit of commercially deployed face-recognition products, which found that some of them were up to 30 percent worse at recognizing darker-skinned women than at recognizing white men. “A lot of the data sets that were in mainstream use in the facial recognition space before [2019] contained 90-plus percent lighter skin subjects, 70-plus percent male subjects,” she says, adding that progress has been made since then, but this is not a problem that has been “solved.”
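To make that finding concrete: audits like the one Raji co-authored measure a model’s error rate separately for each demographic subgroup, then report the gap between the best- and worst-served groups. The Python sketch below illustrates only that disaggregation arithmetic—the subgroup labels and pass/fail results are invented for illustration and do not reproduce the audit’s actual data or code.

```python
# A minimal sketch of a disaggregated accuracy audit, in the spirit of the
# evaluation Raji describes. The results below are hypothetical; a real audit
# would run a commercial face-detection product over a benchmark balanced
# across skin tone and gender, then compare per-subgroup error rates.
from collections import defaultdict

# (subgroup, detected_correctly) pairs — invented audit outcomes.
results = [
    ("darker-skinned female", False), ("darker-skinned female", True),
    ("darker-skinned female", False), ("darker-skinned male", True),
    ("darker-skinned male", True),    ("lighter-skinned female", True),
    ("lighter-skinned female", True), ("lighter-skinned male", True),
    ("lighter-skinned male", True),   ("lighter-skinned male", True),
]

# Tally successes and totals per subgroup.
hits = defaultdict(int)
totals = defaultdict(int)
for group, detected in results:
    totals[group] += 1
    hits[group] += detected

# Per-subgroup accuracy, worst first.
accuracy = {group: hits[group] / totals[group] for group in totals}
for group, acc in sorted(accuracy.items(), key=lambda kv: kv[1]):
    print(f"{group:>24}: {acc:.0%} detected")

# The audit's headline number is the gap between best and worst subgroup.
gap = max(accuracy.values()) - min(accuracy.values())
print(f"Accuracy gap between best and worst subgroup: {gap:.0%}")
```

On this toy data the worst-served subgroup sits well below the best-served one—the same kind of gap, up to 30 percentage points in the 2019 audit, that the disaggregated reporting is designed to expose where a single aggregate accuracy number would hide it.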
