Privacy SOS

No, facial recognition will not stop school shootings. Here’s why.

A school district in New York has flipped the switch to activate a facial recognition system across hundreds of cameras in its school buildings, drawing sharp criticism from civil rights advocates and concerned parents and students. The move comes as school administrators across the country grapple with how to prepare for and prevent rare but devastating incidents of gun violence. Unfortunately for the students, workers, and visitors at schools in Lockport, New York, this facial recognition system will not protect anyone from a school shooting. The system is much more likely to be used to police minor student misbehavior, like cutting class, graffiti, or disorderly conduct. Even worse, it could lead to acute harms for students wrongfully caught up in its surveillance dragnet, a problem much more likely to impact students of color, particularly Black students.

The New York Times describes how the system will function, according to school officials:

[The hot] list includes sex offenders in the area, people prohibited from seeing students by restraining orders, former employees who are barred from visiting the schools and others deemed “credible threats” by law enforcement.

If the software detects a person on the list, the Aegis system sends an alert to one of 14 rotating part- and full-time security personnel hired by Lockport, Mr. LiPuma said. The human monitor then looks at a picture of the person in the database to “confirm” or “reject” a match with the person on the camera.

If the operator rejects the match, the alert is dismissed. If the match is confirmed, another alert goes out to a handful of district administrators, who decide what action to take.

The technology will also scan for guns. The chief of the Lockport Police Department, Steven Abbott, said that if a human monitor confirmed a gun that Aegis had detected, an alert would automatically go to both administrators and the Police Department.
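To underscore how many steps sit between a camera and any real-world response, here is a rough sketch, in Python, of the alert pipeline as the Times describes it. Every class, function, and name below is our own illustrative stand-in; none of it reflects the vendor's actual Aegis code.

```python
# Illustrative sketch only: the structure follows the Times' description of the
# alert workflow; every name and value here is a hypothetical stand-in.
import itertools
from dataclasses import dataclass
from typing import Optional


@dataclass
class Detection:
    kind: str                             # "face" or "gun"
    frame: str                            # placeholder for the camera image
    matched_entry: Optional[str] = None   # hot-list entry the software matched, if any


class Monitor:
    """One of the 14 rotating part- and full-time security staff."""

    def __init__(self, name: str):
        self.name = name

    def confirm(self, detection: Detection) -> bool:
        # In reality a person compares the camera image against the database photo;
        # here we simply simulate a confirmation.
        print(f"{self.name} reviewing {detection.kind} alert from {detection.frame}")
        return True


def notify(recipients, detection: Detection) -> None:
    for person in recipients:
        print(f"Alert sent to {person}: {detection.kind} detected")


def handle_detection(detection, monitors, administrators, police) -> None:
    """Every alert passes through at least one human before anyone is told to act."""
    if detection.kind == "face" and detection.matched_entry is None:
        return                            # faces not on the hot list raise no alert

    monitor = next(monitors)              # rotate through the security staff
    if not monitor.confirm(detection):
        return                            # operator rejects the match; alert dismissed

    notify(administrators, detection)     # administrators decide what action to take
    if detection.kind == "gun":
        notify(police, detection)         # confirmed gun alerts also go to the police


if __name__ == "__main__":
    monitors = itertools.cycle(Monitor(f"monitor-{i}") for i in range(14))
    admins = ["district administrator"]
    police = ["Lockport Police Department"]

    handle_detection(Detection("face", "frame-001", matched_entry="hot-list entry"),
                     monitors, admins, police)
    handle_detection(Detection("gun", "frame-002"), monitors, admins, police)
```

Even in this idealized sketch, nothing happens until a human monitor and then district administrators have each reviewed the alert, which is the delay the second objection below turns on.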

There are numerous reasons why a system like this will not stop school-based violence.

First, any hot list of suspicious persons is unlikely to contain the image of a school shooter. After all, in the vast majority of cases police and school officials do not know someone is potentially dangerous, or planning a school shooting, before the shooting happens.

Second, even if the system by some miracle did include an image of a person suspected to be dangerous, the technology would not stop that person from harming people. The school district says alerts go to “one of 14 rotating” security personnel, who “then look at a picture” to determine whether or not the person is on the suspicious list. If the person is indeed a “match” to the hot list, the security person sends yet another alert to higher-ups in the administration, who determine whether or not to take further action. It’s not clear how long this process takes, but it’s likely that it will take significantly longer than a shooter needs to pull a trigger. By the time someone is shooting up a school, anyone with a cell phone nearby is likely to dial 911. In other words, it’s not clear what purpose the technology would serve during a school shooting. After all, the surveillance camera is not programmed to jump off the wall to attempt to physically pacify someone with a gun.

Third, we don’t know whether the system will function as designed. A recent study by the National Institute of Standards and Technology found that most face recognition algorithms perform more poorly when attempting to identify people of color, women, children, and the elderly. That’s a large majority of the people walking around any school on a given day.

If the system produces false positives, flagging an innocent person as someone on the hot list, people could be at risk of false arrest or worse. That threat is particularly serious for Black students, who, according to the Times, already face disproportionate discipline in Lockport. And even if this worst-case scenario never unfolds, the mere existence of the technology contributes to a creeping carceral atmosphere in public schools, where students are increasingly viewed as subjects to be managed and controlled.

There’s also the problem of mission creep. The system is currently programmed not to track students, but the district’s technology director wants that to change. “The frustration for me as a technology person is we have the potential” to stop a school shooting, he told the Times, but the state board of education refuses to allow him to add suspended students to the list.

A few years ago, social media monitoring of student speech online was the snake oil “security” tech of the day, advertised to spooked school administrators as a cure-all for problems ranging from suicidal ideation to school shootings. Today it’s facial recognition. But no matter what the companies trying to sell these products claim, neither of these technologies will stop violence in our schools.

Slapping the latest digital surveillance tool on a cultural problem with deep, tangled roots doesn’t address the complex crises underlying violence in the United States, and may in fact exacerbate them.

Lockport, New York, is showing school districts across the country what not to do. But this fight is only beginning, and we can make sure other schools don’t follow it down this dangerous path. To do that, here in Massachusetts, the state legislature must pass a moratorium to press pause on the government’s use of face recognition technology before schools in our state begin to experiment with this dystopian software. The benefits of using the technology are unlikely to ever materialize, but the harms are a clear and present danger.
