NYPD Tampered with Facial Recognition Results, Researchers Say

A report claims the NYPD would edit photos and upload celebrity lookalikes into facial recognition software to identify suspects.

A new report on law enforcement’s use of surveillance technology said the New York Police Department abused its facial recognition system by editing suspects’ photos.

Researchers from Georgetown's Center on Privacy and Technology, which specializes in facial recognition, obtained documents for their investigation into the NYPD, NBC News reports.

They found that the department was editing photos and uploading celebrity lookalikes into the facial recognition software in an effort to identify people wanted for crimes.

The report details an April 2017 case in which NYPD investigators were trying to identify a man caught on surveillance video stealing beer from a CVS. The researchers said the image was low quality and did not produce any potential matches in the facial recognition system.

A detective, however, noted that the suspect looked like actor Woody Harrelson, so an image of the actor was submitted in the suspect’s place.

From a new list of results, detectives found a man they believed was a match and arrested him, according to the researchers.

The report also found evidence that the NYPD doctored images of suspects to make them look more like mugshots. To do so, detectives would replace a suspect's facial features with those from photos of models found on Google.

“These techniques amount to the fabrication of facial identity points: at best an attempt to create information that isn’t there in the first place and at worst introducing evidence that matches someone other than the person being searched for,” the report said.

The report calls for a ban on police use of the technology; cities like San Francisco passed such a ban just last week.

“It doesn’t matter how accurate facial recognition algorithms are if police are putting very subjective, highly edited or just wrong information into their systems,” said Clare Garvie, the report’s author and senior associate at the Center on Privacy and Technology. “They’re not going to get good information out. They’re not going to get valuable leads. There’s a high risk of misidentification. And it violates due process if they’re using it and not sharing it with defense attorneys.”

The report also documented incidents of officers misusing facial recognition technology in Maricopa County, Arizona; Washington County, Oregon; and Pinellas County, Florida.

“At least half a dozen police departments across the country permit, if not encourage, the use of face recognition searches on forensic sketches—hand drawn or computer generated composite faces based on descriptions that a witness has offered,” the report said.

The Washington County Sheriff’s Office said in a statement that it “has actually never used a sketch with our facial recognition program for an actual cause. A sketch has only been used for demonstration purposes, in a testing environment.”

The Pinellas County Sheriff's Office responded similarly, while the Maricopa County Sheriff's Office said it no longer maintains a facial recognition system.

The NYPD initially fought Georgetown’s efforts to obtain information about how its facial recognition system worked, but ultimately handed over thousands of pages in documents.

The department said in a statement that it "has been deliberate and responsible in its use of facial recognition technology" and has used it to solve a variety of crimes.

Many officers say that facial recognition technology helps solve cases that otherwise would have gone cold, such as homicides, rapes, and attacks in the city's subway system, where the perpetrator is often unidentified.

The department did not dispute the facts stated in the Georgetown report but said it is reviewing its facial recognition protocols.

The report also provides recommendations for law enforcement agencies that choose to continue to use face recognition in their investigations:

  • Stop using celebrity look-alike probe images.
  • Stop submitting artist or composite sketches to face recognition systems not expressly designed for this purpose.
  • Follow minimum photo quality standards, such as pixel density and the percent of the face that must be visible in the original photo.
  • Carefully document any edits made to the image and their results.
  • Prohibit the use of face recognition as a positive identification under any circumstance.

You can see the full report and its recommendations here.

About the Author

Katie Malafronte is Campus Safety's Web Editor. She graduated from the University of Rhode Island in 2017 with a Bachelor's Degree in Communication Studies and a minor in Writing & Rhetoric. Katie has been CS's Web Editor since 2018.
