Facial recognition technology remains hotly contested at the state and local level, as privacy advocates, politicians and other stakeholders debate the pros and cons of regulating the technology or adopting outright bans.
Although nearly 20 states considered some level of regulation or blanket bans on the technology in 2020 and 2021, only a handful of the legislative efforts were adopted, according to Jake Parker, senior director of government relations for the Security Industry Association (SIA).
“Broadly prohibiting government agencies from using the technology has actually been rejected much more than they’ve been taken seriously,” he says. “Over the last two legislative cycles they have been rejected in 17 states. Rejected either by not being taken up and considered or actually being voted down.”
Several states have enacted conditional restrictions on the use of facial recognition. Effective July 1, Washington adopted the most expansive law so far, imposing a series of conditions on any public-sector agency's use of the technology, including use by law enforcement. Massachusetts has a restriction that applies only to law enforcement purposes and imposes some conditions. Earlier this year, Utah also established conditions for law enforcement use, both throughout the state and by the state's Homeland Security office.
Advocacy organizations led by the ACLU, along with other coalitions, have had the most success in banning facial recognition at the local level. Eighteen jurisdictions at present prohibit use of the technology by city agencies. In 2021, however, only three jurisdictions have adopted such bans: King County, Wash., and Minneapolis banned government use, and in June the Baltimore City Council implemented what is thought to be the most expansive ban of any jurisdiction, restricting personal and business use as well.
“It’s only the second jurisdiction to include private-sector restrictions, after Portland, Ore.,” says Parker.
Notably, among several application-specific carveouts included in these measures, the ordinances in Minneapolis and Baltimore exclude use of facial recognition technology within access control and security systems from the prohibition.
Barring law enforcement from applying facial recognition can negate positive use cases, such as investigating child pornography, human trafficking and other crimes, explains Christian Quinn, senior director, government affairs, for consulting firm Brook Bawden Moore.
He recently retired after serving 25 years as a senior leader with the Fairfax County Police Department in Virginia. As a police major, Quinn led the establishment of a cyber and forensics bureau, dealing with emerging trends related to digital evidence and the need to adopt technology in a manner that balances security and privacy.
“I think anyone who is running a facial recognition program, we’re in favor of regulation. We don’t want a Wild West environment. We want best practices, best standards; we want to have those stipulations, like effective algorithms, humans in the loop, defined use cases, transparency,” he says.
Quinn cites the example of a typical child exploitation case where a seized mobile device may hold thousands of digital images.
“What you don’t want to have is a situation where an examiner has to go through image by image by image to determine what is the nature of [every single] image,” he says.
Some facial recognition tools will group like images together by leveraging artificial intelligence (AI). Examiners can then determine how prolific the suspect or offender is with respect to their child pornography collections. Does the collection involve the same victim or multiple victims? Are there identifiable missing victims who might be recovered?
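The commercial forensic tools Quinn describes are proprietary, but the grouping step he refers to can be illustrated in broad strokes: each face image is reduced to an embedding vector, and images whose embeddings are sufficiently similar are clustered for an examiner to review as a set rather than one by one. The sketch below is a hypothetical simplification, using synthetic vectors in place of a real face-embedding model and a simple greedy cosine-similarity grouping rather than any vendor's actual algorithm.

```python
# Minimal sketch: grouping "like" images by embedding similarity.
# Real forensic tools use proprietary face-embedding models; the random
# vectors below are placeholders standing in for those embeddings.
import numpy as np

def group_by_similarity(embeddings: dict[str, np.ndarray], threshold: float = 0.6):
    """Greedily assign each image to the first group whose representative
    embedding has cosine similarity at or above the threshold."""
    groups: list[dict] = []
    for name, vec in embeddings.items():
        vec = vec / np.linalg.norm(vec)          # unit-normalize for cosine similarity
        for group in groups:
            if float(vec @ group["rep"]) >= threshold:
                group["members"].append(name)
                break
        else:
            groups.append({"rep": vec, "members": [name]})
    return [g["members"] for g in groups]

# Usage with synthetic 128-dimensional vectors in place of face embeddings:
rng = np.random.default_rng(0)
fake_embeddings = {f"img_{i:04d}.jpg": rng.normal(size=128) for i in range(8)}
for cluster in group_by_similarity(fake_embeddings, threshold=0.5):
    print(cluster)
```

In practice, grouping the seized images this way is what lets an examiner gauge the scope of a collection, as Quinn notes, without manually reviewing every file.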
“When you outright ban digital facial comparisons, you throw out those tools with identification technology, sometimes inadvertently, just because that’s not considered,” Quinn cautions.
SIA convenes a working group of more than 30 companies that advises on and participates in legislative advocacy and communications efforts around facial recognition. Parker says a common misconception they hear is that there are no rules in place governing use of the technology. Even where no legislation has been enacted, most major jurisdictions using facial recognition are bound by rules of procedure, he says.
Lumping all uses of the technology under the label of “surveillance” also foments a lot of misconceptions and fallacies, which only serve to fuel calls for bans.
“Surveillance is a scary word for a lot of people,” says Parker. “And even if there’s no surveillance involved, if that’s what you think it is, you’re going to have a different attitude.”
Rodney Bosch is senior editor for CS sister publication Security Sales and Integration. Content derived from a virtual ESX 2021 panel.