EXECUTIVE SUMMARY:

What if technology could be used to keep students safe by installing equipment that could detect bad actors before they even made an opening move? A school district in upstate New York hopes to do just that. But judging from how the effort is starting out, one has to question how effective the initiative will actually be. As with most issues in cybersecurity today, it all comes down to how well the data is managed from a security and privacy perspective.

Lockport City schools are installing a facial-recognition-based surveillance system, developed by Aegis, at six elementary schools, a middle school, a high school, and an administration building as a way to potentially guard against the threat of a mass shooting. Motherboard describes the system this way: “A school using the platform installs a set of high-quality cameras, good enough to detect individual student faces, before determining exactly which biometrics it thinks must set off the system. Crucially, it’s up to each school to input these facial types, which it might source from local police and mug-shot databases, or school images of former students it doesn’t want on its premises. With those faces loaded, the Aegis system goes to work, scanning each face it sees and comparing it with the school’s database. If no match is found, the system throws that face away. If one is, Aegis sends an alert to the control room.”
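The scan-compare-discard-alert loop Motherboard describes can be sketched in a few lines. This is purely illustrative: the names (`FaceDatabase`, `process_frame`, `similarity`), the embedding representation, and the match threshold are all assumptions, not Aegis's actual design.

```python
from dataclasses import dataclass, field

MATCH_THRESHOLD = 0.9  # assumed similarity cutoff, not a documented Aegis value


@dataclass
class FaceDatabase:
    # Maps a watch-list person ID to a face "embedding" (feature vector),
    # sourced by the school from mug shots or old student photos.
    entries: dict = field(default_factory=dict)


def similarity(a, b):
    # Toy cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = lambda v: sum(x * x for x in v) ** 0.5
    return dot / ((norm(a) * norm(b)) or 1.0)


def process_frame(faces, db: FaceDatabase, alert):
    """Compare each detected face against the watch list.

    If no entry matches, the face is discarded (nothing is stored);
    if one matches, send an alert to the control room.
    """
    for face in faces:
        for person_id, embedding in db.entries.items():
            if similarity(embedding, face) >= MATCH_THRESHOLD:
                alert(person_id)
                break
        # no match: the face is "thrown away" and never retained
```

The key design point the article raises is in the last line: whether non-matching faces are truly discarded, and how the watch-list database itself is secured, is exactly the data-management question the district has not answered.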

Setting aside the fact that in many school shootings the perpetrator has been a current student, and setting aside concerns that this kind of surveillance could be an invasion of privacy, the big issue is whether this data will be kept secure. With data breaches and leaks occurring regularly due to misconfigured servers and weak security measures, the amount of personal data already exposed is alarming.

In a separate report today in Motherboard, an app designed to help teachers and parents share information about students' milestones and progress was found to be negligent in protecting the data entrusted to it. The Remini app leaked images, documentation notes, and personally identifiable information through an API that let anyone pull the data without authentication.
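An API endpoint that serves records without checking credentials is exactly this kind of failure. The sketch below shows the authentication check such an endpoint needs; the function and field names are hypothetical, not Remini's actual code.

```python
def handle_student_record_request(request, records, valid_tokens):
    """Return a student record only if the caller presents a valid token.

    Without the token check, anyone can enumerate records simply by
    iterating over student IDs -- the failure mode behind the leak.
    """
    token = request.get("auth_token")
    if token not in valid_tokens:
        # Reject unauthenticated callers before touching any data.
        return {"status": 401, "body": "unauthorized"}

    student_id = request.get("student_id")
    if student_id not in records:
        return {"status": 404, "body": "not found"}

    return {"status": 200, "body": records[student_id]}
```

Note that the check comes before any record lookup: authenticating after the data has been fetched, or only in the client app, leaves the server-side API open to exactly the kind of unauthenticated pulls described above.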

In Lockport, the surveillance program sounds like it could already be off to a rocky start. According to Motherboard, the New York arm of the American Civil Liberties Union (NYCLU) “found nothing in the documents outlining policies for accessing data collected by the cameras, or what faces would be fed to the system in the first place.” On top of that, the NYCLU indicated that Lockport administrators appear to have significant shortcomings in managing servers, files, passwords, and access.

Quoting NYCLU, Motherboard reports, “‘The serious lack of familiarity with cybersecurity displayed in the email correspondence we received and complete absence of common sense redactions of sensitive private information speaks volumes about the district’s lack of preparation to safely store and collect biometric data on the students, parents and teachers who pass through its schools every day,’ an editor’s note to the NYCLU’s statement on the Lockport documents reads.”

Last week, Apple CEO Tim Cook pushed for stricter privacy controls in the face of so many companies failing to safeguard and respect people’s data. In the case of Lockport, given that we’re talking about kids from elementary-school age on up through high school, the sensitivity of that data and the danger of it falling into the wrong hands is pronounced.

Businesses and communities need to consider how their cybersecurity actions, or inactions, can have a lasting impact on the lives of those they serve.

Get the full story at Motherboard.