Facial recognition introduced to US schools
Hopes new tech will prevent school shootings
Jim Shultz tried everything he could think of to stop facial recognition technology from entering the public schools in Lockport, a small city 20 miles east of Niagara Falls.
He posted about the issue in a Facebook group called Lockportians. He wrote an Op-Ed in The New York Times. He filed a petition with the superintendent of the district, where his daughter is in high school.
But a few weeks ago, he lost. The Lockport City School District turned on the technology to monitor who’s on the property at its eight schools, becoming the first known public school district in New York to adopt facial recognition, and one of the first in the nation.
The district, said Mr Shultz, 62, “turned our kids into lab rats in a high-tech experiment in privacy invasion.”
The decision underscores how facial recognition is spreading across the country and being deployed in new ways in the United States, as public officials turn to the technology in the name of public safety.
A few cities, like San Francisco and Somerville, Massachusetts, have barred their governments from using the technology, but they are exceptions. More than 600 law enforcement agencies started using the technology of one company, Clearview AI, in just the past year. Airports and other public venues, like Madison Square Garden in Manhattan, have adopted it as well.
Schools are a newer front, and the debate that took place in Lockport encapsulates the furore surrounding the technology. Proponents call it a crucial crime-fighting tool, to help prevent mass shootings and stop sexual predators. Robert LiPuma, the Lockport City School District’s director of technology, said he believed that if the technology had been in place at Marjory Stoneman Douglas High School in Parkland, Florida, the deadly 2018 attack there may never have happened.
“You had an expelled student that would have been put into the system, because they were not supposed to be on school grounds,” Mr LiPuma said. “They snuck in through an open door. The minute they snuck in, the system would have identified that person.”
But opponents like Mr Shultz say the concerns about facial recognition — namely privacy, accuracy and racial bias — are even more worrisome when it comes to children.
“Subjecting 5-year-olds to this technology will not make anyone safer, and we can’t allow invasive surveillance to become the norm in our public spaces,” said Stefanie Coyle, education counsel for the New York Civil Liberties Union. “Reminding people of their greatest fears is a disappointing tactic, meant to distract from the fact that this product is discriminatory, unethical and not secure.”
The debate in Lockport has unfolded over nearly two years. The school district initially announced its plans to instal a facial recognition security system, called Aegis, in March 2018. The district spent $1.4 million (£1.1m), with money it had been awarded by the state, to instal the technology across 300 cameras.
But when administrators wanted to do a test run last May, the State Education Department told them to hold off, partly in response to mounting public concerns over student privacy. The state wanted Lockport to make sure that students’ data would be properly protected, and demanded a policy that would forbid the use of student data, including their photos.
By June, Lockport officials said they had adjusted their policies, and they began testing parts of the system. In late November, the State Education Department said the district’s revised policy addressed its concerns. In January, the school board unanimously approved the latest policy revision.
When the system is on, Mr LiPuma said, the software looks at the faces captured by the hundreds of cameras and calculates whether those faces match a “persons of interest” list made by school administrators.
That list includes sex offenders in the area, people prohibited from seeing students by restraining orders, former employees who are barred from visiting the schools and others deemed “credible threats” by law enforcement.
If the software detects a person on the list, the Aegis system sends an alert to one of 14 rotating part- and full-time security personnel hired by Lockport, Mr LiPuma said. The human monitor then looks at a picture of the person in the database to “confirm” or “reject” a match with the person on the camera.
If the operator rejects the match, the alert is dismissed. If the match is confirmed, another alert goes out to a handful of district administrators, who decide what action to take.
The technology will also scan for guns. The chief of the Lockport Police Department, Steven Abbott, said that if a human monitor confirmed a gun that Aegis had detected, an alert would automatically go to both administrators and the Police Department.
If the Aegis system sent an alert to the department and the police could not reach anyone at the school to confirm the threat, Mr Abbott said, “it would be treated as a live situation.”
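The escalation workflow described above — a human monitor confirms or rejects each machine match, and only confirmed matches are passed on — can be sketched roughly as follows. This is a minimal illustration of the process as the district describes it, not the actual Aegis software; all names and types here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str                      # "face" (watch-list match) or "gun"
    monitor_confirmed: bool        # the human monitor's confirm/reject decision
    school_reachable: bool = True  # only relevant once police are alerted

def route_alert(d: Detection) -> list[str]:
    """Return the parties notified for a detection, per the described workflow."""
    if not d.monitor_confirmed:
        return []  # rejected match: the alert is simply dismissed
    if d.kind == "face":
        # Confirmed watch-list match: district administrators decide what to do.
        return ["administrators"]
    if d.kind == "gun":
        # Confirmed gun detection: alert goes to administrators AND police.
        notified = ["administrators", "police"]
        if not d.school_reachable:
            # Police cannot reach anyone at the school to confirm the threat.
            notified.append("live-situation response")
        return notified
    return []
```

For example, a rejected face match produces no notifications, while a confirmed gun detection at an unreachable school escalates to a live-situation response.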
Days after the district announced that the technology had been turned on, some students said they had been told very little about how it worked.
“I’m not sure where they are in the school, and I don’t think I’ve even seen them,” said Brooke Cox, 14, a first year pupil at Lockport High School. “I don’t fully know why we have the cameras. I haven’t been told what their purpose is.”
Others, like Tina Ni, 18, said the new technology and the news coverage of her school were “cool.”
Critics of the technology, including Mr Shultz and the New York Civil Liberties Union, point to the growing evidence of racial bias in facial recognition systems. In December, the federal government released a study, one of the largest of its kind, that found that most commercial facial recognition systems exhibited bias, falsely identifying African American and Asian faces 10 to 100 times more than Caucasian faces. Another federal study found a higher rate of mistaken matches among children.
In Lockport, black students are disproportionately disciplined. In the 2015-16 school year, 25 per cent of suspended students in the district were black even though enrolment was only 12 per cent black, according to data from the federal Department of Education.
Mr LiPuma, the director of technology, said he believed that Lockport’s system was accurate. He also said he, as well as some other school officials, would like to add suspended students to the watch list in the future, despite the State Education Department’s recent directive that Lockport make it clear in its policy that it is “never” to use the system “to create or maintain student data.” Most school shootings in the past decade, Mr LiPuma said, were carried out by students.
“The frustration for me as a technology person is we have the potential” to prevent a school shooting, he said. “If something happens, I’m not going to feel any better about that, but it wasn’t my decision. That’s on State Ed.”
Jason Nance, a law professor at the University of Florida who focuses on education law and policy, warned that listing students as “persons of interest” could have unintended consequences.
“If suspended students are put on the watch list, they are going to be scrutinised more heavily,” he said, which could lead to a higher likelihood that they could enter into the criminal justice system.
Jayde McDonald, a political science major at Buffalo State College, grew up as one of the few black students in Lockport public schools. She said she thought it was too risky for the school to instal a facial recognition system that could automatically call the police.
“Since the percentages for the false matches are so high, this can lead to very dangerous and completely avoidable situations,” Ms McDonald said.
She added that she believed police officers would “do whatever it takes in order to stop a suspicious person,” even if that person was a young student in school.
Opponents of the new technology now pin their hopes on state lawmakers. In April, Assemblywoman Monica Wallace, Democrat for Lancaster, introduced a bill that would force Lockport to halt the use of facial recognition for a year while the State Education Department studied the technology. The bill easily passed in the Assembly but was not taken up by the Senate.
Ms Wallace said she intended to make passing the bill a priority in this new legislative session.
“We all want to keep our children safe in school,” she said. “But there are more effective, proven ways to do so that are less costly.”
She said school districts could, for instance, take smaller steps like upgrading entrances and exits, hiring school resource officers, and investing in counsellors and social workers.
Mr Shultz said he would keep making his case.
“Hopefully, other districts around the country will learn from Lockport’s dumb mistakes,” he said. “A decision to bring facial recognition and artificial intelligence into our schools ought to be the subject of a serious conversation.”
The New York Times