Virginia’s New Rules for Facial Recognition and What They Mean

[Image: Police officer using facial recognition technology]

Updated May 6, 2022

On April 27, 2022, S.B. 741 was enacted in Virginia, replacing the state’s complete ban on local law enforcement use of facial recognition technology with the most stringent law in the country regulating its use. 

Introduced by Virginia Delegate Jay Leftwich (R-Chesapeake) and Sen. Scott Surovell (D-Fairfax), the measure establishing statewide rules passed with bipartisan support from a majority of each party in Virginia’s Republican-controlled House of Delegates (31 Republicans and 23 Democrats in support) and Democratic-controlled Senate (11 Republicans and 16 Democrats in support), including all Senate members of the Virginia Legislative Black Caucus.

The ban, in place only since July 2021, had quickly passed earlier that year with little review during the peak of the COVID-19 pandemic. Lawmakers soon grew concerned about the prohibition’s public safety impact. Recent crimes in Northern Virginia following the ban loomed large in discussions of the bill, including the crimes attributed to the serial murderer known as the “shopping cart killer,” the murder of a Black man at a Falls Church ATM and other investigations in which the technology could have provided an advantage over relying solely on eyewitness descriptions, public announcements or anonymous tips to develop leads. Because some agencies had lost tools critical to solving criminal cases, finding missing children and protecting the public, as illustrated by a number of success cases reported prior to the ban, there was added urgency to establish statewide rules that would restore them.

Retired Maj. Christian Quinn, a former law enforcement official who led an analysis unit in Northern Virginia, shared during a committee hearing on Feb. 9, 2022, that the technology had been leveraged in thousands of cases with “no misidentifications or negative outcomes, and a myriad of success cases,” adding that “eyewitness identification can be very challenging and it leads to a lot of the wrong people getting roped into investigations that don’t need to be. We had an exoneration where someone brought us an image and we told the detectives ‘this is not your person’ as he was incarcerated at the time of.”

Delegate Cliff Hayes (D-Chesapeake) spoke in support of the bill on March 3, saying, “I’m not looking at this through a political lens, but through my professional experience of over 25 years in this space, and by the way, a Black man that has been profiled a lot of different ways and situations, so I don’t take it lightly, when I look at how this is going [to] help in a whole lot of situations.”

Limitations

The Virginia facial recognition bill limits law enforcement use of the technology to 14 enumerated purposes that align with longstanding use cases for U.S. law enforcement investigations, such as helping identify a crime victim, witness or suspect by comparing a photo of the unknown individual with a database of other available photos. Also specifically authorized in the bill are public welfare scenarios, such as helping a person who is not able to identify themselves and helping identify a missing or deceased person.

The process is often compared with analyzing latent prints from a crime scene, though unlike fingerprint and DNA matching, a potential facial recognition match is not considered positive identification. If an analyst using the software determines that an image from a database likely matches a submitted image of a suspect, other means outside of facial comparison should be used to provide the confirming evidence needed to establish probable cause.
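To make that workflow concrete, the sketch below is a simplified illustration, not any agency’s or vendor’s actual system: a query against a photo database returns a ranked candidate list, never a definitive identification. The class names, the cosine-similarity ranking and the assumption that images have already been converted to numeric feature vectors are the editor’s stand-ins for a vendor’s proprietary matcher.

```python
# Illustrative sketch of a 1:N facial recognition search producing
# investigative leads. All names and the similarity measure are assumptions,
# not a description of any real product.

from dataclasses import dataclass
import math

@dataclass
class GalleryRecord:
    subject_id: str        # identifier of the database record (e.g., booking photo ID)
    template: list[float]  # feature vector produced by a face recognition algorithm

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def generate_leads(probe: list[float], gallery: list[GalleryRecord], top_k: int = 10):
    """Rank gallery records by similarity to the probe template.

    The output is a candidate list of investigative leads only: a trained
    examiner must review it, and independent corroborating evidence is
    needed before probable cause can be established.
    """
    ranked = sorted(gallery, key=lambda r: cosine_similarity(probe, r.template), reverse=True)
    return [(r.subject_id, round(cosine_similarity(probe, r.template), 3)) for r in ranked[:top_k]]
```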

Virginia lawmakers recognized the need to ensure potential match results are never used as the sole basis to establish probable cause for an arrest. In fact, they went a step further, adding a prohibition on including facial recognition match results in any affidavit to establish probable cause for a search warrant or arrest warrant.

During floor debate on March 3, Delegate Glenn Davis (R-Virginia Beach) reiterated, “All we are doing here is, instead of having a law enforcement officer sit there and look through thousands of photos praying and trying to find that match, we use technology that helps create that efficiency…unlike the movies when there is a match, this isn’t Minority Report – the computer does not issue an arrest warrant…when there is a potential match, then human eyes have to look at it and say, yes, that looks like a probable match, and we can continue down the same route that already happens under the status quo, when an officer with their own eyes looks at two pictures and says there is potentially a match.”

Additionally, to address concerns raised during committee and floor consideration of the measures, amendments were made to further ensure that the technology will not be used outside of these purposes for “surveillance” by prohibiting real-time tracking of a known person’s movements in public spaces. In a Feb. 28 committee hearing, bill sponsor Sen. Surovell reminded colleagues that “new technology often creates advantages and efficiencies that help us have a safer society. [Under the bill] It’s not monitoring, not surveillance, it can only be used for specific purposes.”

Amendments were also adopted to ensure the Virginia facial recognition bill’s limitations apply to the Virginia State Police in addition to local law enforcement, as the ban had restricted only local use. This became a key issue for many lawmakers. In urging members to support the bill on March 3, bill sponsor Delegate Leftwich put it this way: “The state police are using this technology. Do you want them to use it with guardrails or not? Simple as that.”

Surovell later emphasized, “This is way better [than the status quo] because it creates guardrails and restrictions, accountability, transparency, consequences for violating the law, whereas right now it’s wide open, except for the Constitution.”

Agency Requirements

Prior to using facial recognition technology, a local law enforcement agency must notify its local governing body and develop and publicly post a use policy that meets or exceeds statewide standards in a model policy to be developed by the Virginia State Police. Those standards will include elements such as the nature and frequency of training for personnel using the technology; procedures for human review of query results; and the documentation, preservation and protection of data from queries conducted.

Once an agency is using the technology, it must maintain records on the program to facilitate discovery in criminal proceedings, periodic audits and public reporting. These agencies are required to produce an annual report detailing the technologies and algorithms used, the nature of the databases queried, the number of queries performed and the type of crime related to each, and other information.
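As a rough illustration of the record-keeping this implies, the sketch below models the reporting elements named in the bill as a simple data structure. The class and field names are the editor’s assumptions for illustration only, not the statute’s own schema.

```python
# Illustrative record-keeping sketch. Field names are assumptions drawn from
# the reporting elements listed above, not statutory language.

from dataclasses import dataclass, field
from collections import Counter

@dataclass
class QueryRecord:
    case_number: str
    crime_type: str        # offense category associated with the query
    database_queried: str  # e.g., booking photos or another authorized source
    algorithm: str         # vendor/algorithm used for the search

@dataclass
class AnnualReport:
    agency: str
    year: int
    queries: list[QueryRecord] = field(default_factory=list)

    def summary(self) -> dict:
        """Aggregate the raw query log into the elements an annual report would cover."""
        return {
            "technologies_and_algorithms": sorted({q.algorithm for q in self.queries}),
            "databases_queried": sorted({q.database_queried for q in self.queries}),
            "total_queries": len(self.queries),
            "queries_by_crime_type": dict(Counter(q.crime_type for q in self.queries)),
        }
```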

Technology Standards

Under the new bill, Virginia will be the first state in the nation to require facial recognition technologies for law enforcement to be evaluated by the National Institute of Standards and Technology (NIST), which many deem essential to ensuring use of the highest-quality technology. The additional requirements that the technology achieve 98% accuracy on relevant tests and exhibit “minimal performance variations across demographics” reflect a growing interest among lawmakers in establishing a minimum accuracy standard for such applications. Any vendor providing the technology to law enforcement will have to first be approved by Virginia’s Division of Purchases and Supply as meeting these standards.
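The statute does not spell out exactly how “98% accuracy” or “minimal performance variations across demographics” would be computed from NIST test results, so the sketch below assumes one plausible reading: a per-demographic accuracy floor plus a cap on the spread between the best- and worst-performing groups. The threshold values, group labels and function name are illustrative only.

```python
# Illustrative compliance check only; neither the statute nor NIST prescribes
# this exact computation. Assumes per-demographic accuracy figures are available.

ACCURACY_FLOOR = 0.98          # "98% accuracy on relevant tests"
MAX_DEMOGRAPHIC_SPREAD = 0.01  # hypothetical reading of "minimal performance variations"

def meets_standard(accuracy_by_group: dict[str, float]) -> bool:
    """Check reported accuracy against the two statutory criteria as read above."""
    worst = min(accuracy_by_group.values())
    spread = max(accuracy_by_group.values()) - worst
    return worst >= ACCURACY_FLOOR and spread <= MAX_DEMOGRAPHIC_SPREAD

# Example with made-up numbers:
print(meets_standard({"group_a": 0.991, "group_b": 0.987, "group_c": 0.989}))  # True
print(meets_standard({"group_a": 0.991, "group_b": 0.975, "group_c": 0.989}))  # False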

While some questions were raised during debate on the measures about pegging this standard to performance in NIST’s testing program, it is worth noting that this federal program is used to validate technologies for U.S. government applications where highly accurate performance is critical to national and homeland security. In fact, the NIST Face Recognition Vendor Test program has become the world standard for objective third-party scientific evaluation, which provides an apples-to-apples comparison of the performance of facial recognition technologies. Though not currently configured to establish or certify an accuracy standard, the range of tests periodically conducted under the NIST program includes those with relevance to law enforcement applications (notably the “Investigation Performance” tests) against images of varying quality and demographics and using data sets similar to or larger in size than what would be available to law enforcement agencies (up to 12 million images).

It is likely that NIST testing will soon become even more robust. On March 10, the same day the Virginia facial recognition measure was cleared by the state’s general assembly, the U.S. Congress passed its fiscal 2022 omnibus appropriations bill, which doubles funding over what the Senate originally proposed to bolster NIST “AI research and measurement science efforts” as part of a larger $62 million increase to NIST’s Scientific and Technical Research and Services programs. Within the $31 million appropriation, Congress directed NIST to expand testing activities in order to “meet growing demand for the Facial Recognition Vendor Test and to improve the test” as outlined in a previous report recognizing the program as “an important resource for government, commercial and academic developers to assess the quality of their facial recognition technologies. As more companies and government users invest in this technology, the test will continue to be a critical step for responsible use.” Specifically, NIST is directed to “expand testing to include a more diverse combination of demographics and environmental settings in the test data” among other improvements.

Emerging Trend

Virginia is the latest and largest jurisdiction where policymakers are rethinking blanket bans imposed on facial recognition. After Vermont had passed the only other statewide measure banning law enforcement use in October 2020, policymakers quickly discovered that the ban extended even to software for analyzing seized digital evidence in child sexual exploitation cases. The legislature returned in 2021 to pass a bill exempting such software. Minneapolis passed an ordinance in 2021 prohibiting city agencies from using the technology but included a specific exemption allowing the city police department’s child sex crimes unit to continue using facial recognition tools. Four months after Portland, Oregon’s ban on facial recognition went into effect in 2021, the city’s mayor begged for assistance from the public and from other agencies to identify those responsible for acts of violence during demonstrations in the city, even urging them to film events and provide information to “unmask” them. More recently, the cities of San Francisco and New Orleans are reportedly considering amending recent ordinances that prohibit use of facial recognition technology by city police departments.

Public concerns about facial recognition technology have centered on law enforcement and fears the technology might be used inaccurately or inappropriately, or in ways that raise privacy and civil liberties concerns. At the same time, it is clear there is significant public support for appropriate use to solve crime and protect communities. An October 2021 Zogby poll of Virginians found that 75% see law enforcement use of facial recognition technology in general as appropriate and beneficial, a view consistent across political affiliations, with even higher support for finding missing persons and children (87%) and solving crimes (77%); these results are consistent with earlier nationwide polling commissioned by the Security Industry Association (SIA).

It’s clear there is growing interest in approaches that address concerns about facial recognition technology while ensuring law enforcement agencies are leveraging the software in a bounded, effective, accurate and nondiscriminatory way that benefits the communities they serve. SIA has published its Principles for the Responsible and Effective Use of Facial Recognition Technology as a resource to assist stakeholders in these efforts, and we continue to closely track policy developments affecting biometric technologies.