Last week, a broad coalition of civil rights, civil liberties, privacy and immigrant-rights groups met with representatives from the FBI and Justice Department to demand more transparency around the use of an increasingly popular law enforcement tool: facial-recognition technology.

The meeting was in response to a recent report from the Georgetown Law Center on Privacy & Technology that found that law enforcement agencies across the country are adding this technology to their arsenal of investigatory tools.

While the report found that the practice affects over 117 million people, agencies across the board have failed to put in place safeguards to protect our privacy.

Worse yet, while the technology potentially threatens the rights of everyone in America, the report uncovered damning racial biases within the systems.

How it works

Facial-recognition technology works by running a photo against an algorithm that attempts to pull distinct characteristics — like cheekbone or eye position — and compare them to a connected database. It then returns either the most similar-looking faces or all photos above a certain similarity threshold.
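In rough outline, that two-step process (extract a numeric "fingerprint" of a face, then compare it against a database) can be sketched as a similarity search over feature vectors. This is a simplified illustration, not any agency's actual system: the toy three-dimensional vectors, the gallery names, and the cosine-similarity measure are all stand-ins for proprietary feature extractors and matching algorithms.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two feature vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(probe, gallery, top_k=None, threshold=None):
    """Compare a probe face vector against a gallery of enrolled vectors.

    Returns either the top_k most similar entries, or every entry whose
    similarity meets the threshold, as (name, score) pairs sorted best-first.
    """
    scores = [(name, cosine_similarity(probe, vec))
              for name, vec in gallery.items()]
    scores.sort(key=lambda item: item[1], reverse=True)
    if threshold is not None:
        return [(n, s) for n, s in scores if s >= threshold]
    return scores[:top_k]

# Toy 3-D "embeddings" standing in for real extracted face features.
gallery = {
    "A": [1.0, 0.0, 0.0],
    "B": [0.9, 0.1, 0.0],
    "C": [0.0, 1.0, 0.0],
}
probe = [1.0, 0.05, 0.0]

print(search(probe, gallery, top_k=2))        # the two closest gallery faces
print(search(probe, gallery, threshold=0.9))  # everyone above 0.9 similarity
```

The two calling modes at the end mirror the two behaviors described above: return the most similar-looking faces, or return all photos above a similarity threshold.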

The system is a network of various state and federal databases that are generally built of drivers’ licenses and mug shots. These databases sweep up millions of innocent Americans without their knowledge and put them into what Georgetown calls the “perpetual lineup.”

It’s the first time in our nation’s history that the FBI has maintained a biometric database made up primarily of innocent people.

Law enforcement might run a search on a person while on a routine stop. Officers might run a photo taken from a surveillance camera at the scene of an alleged crime.

Particularly worrisome is the fact that some agencies have expressed interest in running searches in real time of public spaces.

This kind of persistent surveillance has serious implications for our ability to simply move through life with anonymity. Even more concerning, it may impact people’s ability to exercise their First Amendment rights without being identified and targeted by the government.

While jurisdictional controls exist, in theory, to keep each agency from acting outside its own territory, there's extensive sharing among the many systems.

For example, more than 5,300 officials from 242 different federal, state and local agencies can access Pinellas County, Florida’s database, which has been in place since 2001.

The FBI hosts one of the nation’s largest facial-recognition databases, built mostly from mug shots submitted by local, state and federal law enforcement agencies. In total, the database contains nearly 25 million photos.

Racial bias

While law enforcement has argued that the new technology is colorblind, the Georgetown report points to several studies that have found racial bias in these systems.

The most prominent of these — co-authored by FBI expert Richard W. Vorder Bruegge — found that several of the leading algorithms were 5–10 percent less accurate when used to identify Black people.

One of the algorithms failed to identify White people accurately 11 percent of the time, but that failure rate jumped to 19 percent when the subject was Black.

While a false negative could result in police missing an important lead, a false positive could implicate innocent people. As the Georgetown report points out, many systems are programmed to deliver what the algorithm thinks are the closest matches for a particular face even if there’s no degree of certainty about the identity, potentially leading to investigations of innocent people.
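The hazard of that design choice can be shown in a few lines. This is a hypothetical sketch with made-up names and toy two-dimensional vectors, not a real system: a pure top-match search still reports a "closest match" even when no enrolled face actually resembles the probe.

```python
import math

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical two-person gallery with toy 2-D "embeddings".
gallery = {"enrolled_1": [1.0, 0.0], "enrolled_2": [0.0, 1.0]}

# A probe face genuinely unlike either enrolled person.
probe = [0.6, 0.5]

# A top-k search ranks the gallery and reports the best entry no matter
# how weak that best similarity actually is.
ranked = sorted(gallery.items(),
                key=lambda kv: cosine(probe, kv[1]),
                reverse=True)
best_name = ranked[0][0]
best_score = cosine(probe, ranked[0][1])
print(best_name, round(best_score, 2))  # a "match" is returned regardless
```

Without a similarity floor, someone always tops the list, which is how a system with no certainty requirement can point investigators at an innocent person.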

The report suggests that one source for such dangerous and deeply ingrained bias is that the technology best recognizes the kinds of people who create it.

It points to a study from the National Institute of Standards and Technology that tested for facial-recognition accuracy on East Asian and White people. Researchers found that algorithms developed in East Asia performed better on East Asians while those created in Western Europe performed better on White people.

Drawing on their own knowledge base and the nuances they themselves can see, the people who write the code tend to do better at recognizing people in their own racial and ethnic demographic groups.

But the tech industry is notoriously deficient when it comes to hiring people of color.

A recent study found that while Latinx people earn 8 percent of computer-science degrees and Black people earn 6 percent, they account for only 3 percent and 1 percent, respectively, of the workforce at major tech firms.

The racial-justice implications for the growing use of facial-recognition technology are exacerbated by other alarming issues.

For example, the databases often use arrest records, where Black people are overrepresented due to existing biases in policing. Consequently, the technology is the least accurate on the people that it’s most often trying to identify.

The ramifications of this discrepancy are problematic for a number of reasons. But when they reinforce bias that’s already built into problematic police practices, the consequences can be deadly.

Despite all this, the Georgetown report found that two of the major companies producing this technology didn’t even test for racial bias before taking their products public.

First Amendment implications

Pervasive surveillance of public spaces compromises our right to anonymous speech.

Officers may take photos or film video, or stationary surveillance cameras may constantly and quietly record public spaces. In either case, the ability to identify people attending political protests from a distance and long after the fact could have a chilling effect on free speech.

While direct interaction between protesters and law enforcement can be chilling, or even dangerous, at least in those cases people have heightened awareness of the police presence and potential surveillance. When the filming is done more surreptitiously and then matched up with facial-recognition tools in real time or afterwards, the effects could be startling.

That reality is closer than we think.

Sen. Al Franken recently called out the FBI for photos found in the agency’s internal documents that depict people at Clinton and Sanders rallies being tracked with facial-recognition technology. Last August, the FBI itself released 18 hours of spy-plane footage it had taken at Black Lives Matter protests in Baltimore in spring 2015.

While the right to anonymous speech is vital to everyone in a democracy, the stakes for those protesting racial injustice are particularly high.

One case that illustrates this fact is NAACP v. Alabama: During the height of the civil rights movement, the state tried to force the group’s state chapter to disclose the names and addresses of all of its members.

Recognizing the danger this could pose to those members, the chapter refused. In 1958, the Supreme Court ruled unanimously in favor of the NAACP, finding that compelled disclosure would likely suppress lawful association among the group’s members: earlier disclosures of member identities had led to loss of employment, physical coercion and other hostile treatment.

The Supreme Court recognized then that the rights to association and privacy are intrinsically linked, particularly for those engaged in protest and dissent. Some states and law enforcement agencies have made efforts to rein in the use of many other new police technologies that could contribute to pervasive surveillance (like body-worn cameras and drones).

But according to the Georgetown report, not one state has passed a comprehensive law regulating law enforcement’s use of facial-recognition technology — despite the clear harms to both civil rights and civil liberties.

In its meeting last week, the coalition, which included Free Press, asked the FBI to shed additional light on the use of this powerful technology. We need to address those dangers sooner rather than later.
