AI-Based School Surveillance Raises Student Privacy Concerns

The New York State Education Department’s decision to approve the installation of facial recognition software in a school district has raised student privacy and data security concerns.


Last week, the New York State Education Department (SED) gave the go-ahead for the state’s first school facial recognition systems, renewing debate over AI-driven surveillance and how minors’ images may be stored and shared.

In July, SED had banned the Lockport City School District from using or testing facial recognition due to privacy and data safety concerns. Parents and civil liberties groups asked the district to wait until the state finalized guidelines on how the images could be used.

Following several months of meetings between Lockport school leaders and state education officials, SED approved the cameras’ use after the district agreed to revise its privacy policy to ensure no student data would be retained, the Times-Union reports.

“With these additional revisions, the Department believes that the Education Law issues it has raised to date relating to the impact on the privacy of students and student data appear to be addressed,” Temitope Akinyemi, chief privacy officer at SED, wrote in a letter.

The $1.4 million system, which was funded by taxpayers through the Smart Schools Bond Act of 2014, relies on the Aegis software suite created by SN Technologies. An investigation by the New York Civil Liberties Union’s (NYCLU) Education Policy Center found that the security consultant who recommended Lockport install facial recognition software may have received financial benefits from Aegis and the electrical company that installed the system.

“The pattern in schools we are seeing is really technology in search of a market,” said Johanna Miller, a civil rights attorney and director of the NYCLU’s Education Policy Center. “These are tech start-ups that see deep pockets … and school districts are put in a position to go way out of their comfort zone and evaluate these vendors.”

Cybersecurity and privacy experts warn there is no way to prevent tech companies from using students’ facial images to fine-tune their algorithms, according to the Times-Union. Experts also warn of potential bias and false positives, as studies have shown facial recognition programs misidentify people of color, women and young people at disproportionately high rates.

“We don’t think the State Education Department has done its due diligence in really getting a grasp on how this technology works,” Miller added. “It seems from the letter that they are not familiar with the technology at all. The concept that no student data will be retained is flawed.”

Other AI-driven surveillance systems being installed in some N.Y. school districts include vape detectors and programs that monitor social media or log keystrokes on school-issued devices to flag words associated with bullying or self-harm. At Hudson Valley Community College in Troy, school officials are attempting to install cameras in classrooms, a move professors argue violates their contract and infringes on academic freedom.

Debates surrounding the use of biometric technology by public entities are happening all over the country. In May, San Francisco banned the use of facial recognition technology by police and other governing departments.

New York recently eliminated fingerprinting to determine eligibility for programs such as food stamps or Medicaid, citing criticism that the process was invasive and a potential deterrent to applicants. In July, New York Governor Andrew Cuomo also enacted the Stop Hacks and Improve Electronic Data Security (SHIELD) Act, which broadens the definition of “private information” to include biometric data and limits how companies can handle the data.

Also in New York, the state Legislature approved two education laws that prohibit the state from sharing student information with databanks, specify how schools and vendors must secure student data, and ban the sale of personal student information or its use for marketing purposes.

Antony Haynes, an Albany Law School faculty member who serves as director of cybersecurity and privacy law, told the Times-Union that without clear guidelines dictating how all of the collected data can be handled, schools are creating a digital record of student activity that could have lifetime consequences.

“The real danger is that records are not being deleted,” he said. “The concern is that it can become a permanent record and that it can be sold to third-party vendors.”

The FBI’s Internet Crime Complaint Center also warned last year that “the widespread collection of sensitive information by ed-tech could present unique exploitation opportunities for criminals.”


About the Author

Amy is Campus Safety’s Executive Editor. Prior to joining the editorial team in 2017, she worked in both events and digital marketing.

Amy has many close relatives and friends who are teachers, motivating her to learn and share as much as she can about campus security. She has a minor in education and has worked with children in several capacities, further deepening her passion for keeping students safe.
