Is Facial Recognition Software Racist?


Today’s post is part 1 of a 2-part series and is brought to us by Alex Janssen. Alex is a current 1L at the University of Montana School of Law. She received her undergraduate degree from the University of Denver and served two years in the Peace Corps. She currently serves as the VP for MBLSA and the 1L Rep for ACLU, and is a member of both WLC and NLG. She is also a naturalization interview mentor for Lutheran Family Services in Colorado and a high school mock Congress judge for the Nebraska circuit.


Spoiler alert: yes. Issues with facial recognition software (FRS) have been on development companies’ radars for years now. In 2018, Brian Brackeen, the CEO of the FRS developer Kairos, wrote an article for TechCrunch detailing the many flaws of the software available on the market, suggesting it was not ready for use by law enforcement. Prominent examples of FRS getting it wrong, such as Michelle Obama and Serena Williams being incorrectly classified as male, are featured in the documentary Coded Bias, which serves as an exposé of the problems associated with FRS. Joy Buolamwini, a leading researcher on algorithmic bias, discovered the flaws in facial recognition software while working on a project at MIT: the software could not detect her face until she put on a white mask (see photo). Fast forward to 2020, and these same concerns are gaining traction with the general public in light of the Black Lives Matter movement, calls for police reform and abolition, and a global pandemic forcing much of day-to-day life online.

BUOLAMWINI MASK, Source: NPR

What Is Facial Recognition Software (FRS)?

Often thought of as harmless technology, facial recognition systems are used every day all over the world. If you have a current smartphone, odds are you unlock it via biometrics, either by letting it recognize your face or by using your fingerprint. Facebook and other social media platforms know which photos you are in. Many companies grant employees access to buildings via FRS. Facial recognition software has been lauded as artificial intelligence that enhances people’s lives, but when used in the criminal justice system this technology has a dark side.

FRS discriminates against Black, Indigenous, and people of color (BIPOC) communities. When used as a law enforcement tool, these software applications perpetuate racism in present-day policing. Additionally, one of the most pressing concerns with facial recognition software is its largely unregulated use in the United States. According to Kristian Hammond, a computer science professor at Northwestern University, questions of legality regarding FRS should be at the forefront. Hammond asks, “Is the result of this technology admissible in court? Who’s going to answer that?” Without any oversight of how these systems are implemented, police departments and government agencies can exert unprecedented supervision over people.

Historical Surveillance of Black Americans

Before exploring some of the challenges associated with individual uses of FRS, it is important to understand the historical context of the surveillance of Black and minority communities in the United States. Dating back to the 18th century, “lantern laws” in New York required enslaved people to carry candle lanterns with them after dark in order to be illuminated, to be watched. This kind of constant supervision has evolved over the years and seen many iterations: from Jim Crow laws that maintained supervision through segregation, to COINTELPRO monitoring Black activists such as MLK Jr., Malcolm X, and the Black Panther Party, to the creation of a secret intelligence program targeting “Black Identity Extremists.” The policing of Black bodies became even more prominent following the “broken windows” theory of crime and the subsequent implementation of surveillance measures in BIPOC neighborhoods. Still today, mass surveillance systems used by police are disproportionately installed in Black neighborhoods.

Racism in the Creation of AI Technologies

Another important piece of the FRS puzzle is understanding how these technologies are developed and how that development interacts with their implementation by law enforcement and regulatory bodies. The bottom line is that the technology itself is biased. Research shows FRS most often misidentifies Black women between 18 and 30 years old; research by Buolamwini in 2018 found that Black women were misclassified nearly 35% of the time, while the same algorithms consistently classified white men accurately. These programs rely on algorithms that “learn” how to identify a face from millions of photos. A CBS News article explains that the problem with this process lies in the photos chosen to “train” the algorithm: when developers use predominantly white and male photos, the software struggles to recognize people of color and women. The impacts of this racial bias are far reaching, and the scope of the problem is staggering. According to an article written by a Harvard bioengineering student, “it is estimated that almost half of American adults—over 117 million people, as of 2016—have photos within a facial recognition network used by law enforcement.” The author continues, “this participation occurs without consent, or even awareness, and is bolstered by a lack of legislative oversight.” These systems pose a unique danger to Black people and other communities of color historically impacted by racism and overtly racist surveillance measures.
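The disparity described above comes down to simple per-group arithmetic: an overall accuracy number can look impressive while hiding very different error rates for different groups. The sketch below uses hypothetical audit counts (illustrative figures only, loosely modeled on the Gender Shades findings cited here; the group labels and exact counts are assumptions, not the study’s raw data) to show how per-group error rates are computed and compared.

```python
# Hypothetical classification audit, per demographic group.
# Counts are illustrative only (assumed for this sketch), chosen to
# echo the roughly 35% vs. under-1% gap reported in the article above.
audit = {
    "darker-skinned women":  {"correct": 651, "wrong": 349},
    "darker-skinned men":    {"correct": 880, "wrong": 120},
    "lighter-skinned women": {"correct": 930, "wrong": 70},
    "lighter-skinned men":   {"correct": 992, "wrong": 8},
}

def error_rate(counts):
    """Fraction of faces the system misclassified for one group."""
    total = counts["correct"] + counts["wrong"]
    return counts["wrong"] / total

# Report groups from worst-served to best-served.
for group, counts in sorted(audit.items(),
                            key=lambda kv: -error_rate(kv[1])):
    print(f"{group:>22}: {error_rate(counts):.1%} misclassified")
```

Note that the blended accuracy across all four groups would exceed 86%, a figure a vendor could advertise truthfully while the system fails darker-skinned women more than a third of the time. This is why audits like Gender Shades report accuracy per group rather than in aggregate.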

FRS ACCURACY AUDIT, Source: Gender Shades Project

Racial Discrimination by Law Enforcement

Many social justice and civil rights organizers have feared for years that law enforcement could abuse FRS as a tool to target people of color. Law enforcement agencies use this technology in a variety of ways, from identifying suspects and making arrests to cross-referencing mugshot databases. According to an ACLU article from 2019, the federal government released findings about its own facial recognition algorithms, concluding that these systems disproportionately misidentify people of color, women, children, and elderly persons, and that error rates are highest among Black women. It can be difficult to humanize data and comprehend the impacts of facial recognition software on BIPOC communities. In the context of law enforcement, “one false match can lead to a wrongful arrest, a lengthy detention, and even deadly police violence.” These outcomes are simply unacceptable and add to the long list of wrongs committed by police against people of color. The software developers and tech companies behind these systems are therefore complicit in perpetuating the systemic racism that exists in our criminal justice system.

According to a Harvard article by Alex Najibi, many localities in the U.S. use facial recognition programs to identify people within mugshot databases, and a number of local jurisdictions also compile their own separate crime databases, thereby increasing the number of citizens present in facial recognition systems. Due to racial profiling and other discrimination by law enforcement, such as incentives to falsify reports, Black people are overrepresented in these databases. The well-documented racist roots of the war on drugs led to disparate arrests of Black people for cannabis use compared to white people, despite cannabis use rates being about the same. As a result, more Black people’s information is stored in the mugshot databases that facial recognition software then combs, further entrenching this discriminatory cycle. The use of photography in the criminal legal system, primarily through mugshots, has a racist history that contributes to the problematic use of FRS as applied to these databases. A 2019 Commerce Department study of false matches (systems wrongly classifying two different individuals as the same person) found that error rates within one U.S. mugshot database were highest for Native Americans, followed by Asians and Black women. Research findings in this area of artificial intelligence continually show that, as usual, we are heavily relying on a practice that works for white men, and no one else, “to keep Americans safe.”
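The discriminatory cycle described above can be illustrated with a toy simulation. In the sketch below, two groups commit an offense at exactly the same rate, but one is policed (and therefore arrested) more aggressively; the resulting mugshot database ends up skewed far beyond the over-policed group’s share of the population. All group names, population sizes, and rates are invented for illustration, not drawn from any real dataset.

```python
import random

random.seed(42)  # make the toy run reproducible

OFFENSE_RATE = 0.10  # identical for both groups, by construction
# Disparate policing: group_a members face 3x the arrest probability
# per offense. These numbers are assumptions for the sketch.
ARREST_PROB = {"group_a": 0.30, "group_b": 0.10}
POPULATION = {"group_a": 10_000, "group_b": 40_000}

# Simulate one year: who ends up in the mugshot database?
database = {"group_a": 0, "group_b": 0}
for group, size in POPULATION.items():
    for _ in range(size):
        offended = random.random() < OFFENSE_RATE
        arrested = offended and random.random() < ARREST_PROB[group]
        if arrested:
            database[group] += 1

total = sum(database.values())
for group, count in database.items():
    pop_share = POPULATION[group] / sum(POPULATION.values())
    db_share = count / total
    print(f"{group}: {pop_share:.0%} of population, "
          f"{db_share:.0%} of mugshot database")
```

With these assumed numbers, group_a is 20% of the population but roughly twice that share of the database, even though its members offend at the same rate. Because FRS searches run against that skewed database, members of the over-policed group face more match attempts, and so more chances of a false match, which feeds further arrests and deepens the skew.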

Another use of FRS that targets marginalized communities is its implementation by ICE and local police to identify undocumented immigrants and Muslim citizens. FRS plays a vital role in helping “fusion centers” function in their alleged anti-terrorism efforts. Many Americans are actively fighting, via petition, for the protection of privacy and for bans on facial surveillance by local police departments and federal law enforcement agencies.

The criminal legal system is already designed to make Black people suffer. The disparate outcomes have been and will continue to be well documented. Black people overwhelmingly face racism and bias in every part of the process, from arrest to sentencing to incarceration. FRS is simply another means for the persecution of people of color. 

Until Next Week…

In the words of Malkia Cyril, a Black Lives Matter activist and founder of the Center for Media Justice, “If my mother were alive, she would remind me that a government that has enslaved Africans and sold their children will just as quickly criminalize immigrant parents and hold their children hostage, and call Muslim, Arab, and South Asian children terrorists to bomb them out of being. She would remind me that undermining the civil and human rights of Black communities is history’s extreme arc, an arc that can still be bent in the direction of justice by the same bodies being monitored now. The only remedy is the new and growing movement that is us, and we demand not to be watched but to be seen.”


Further Learning: 

AI, Ain’t I A Woman? by Joy Buolamwini

Accuracy of Facial Recognition Chart – The Gender Shades Project

Project Green Light Detroit Example
