Using Your Face Against You: Facial Recognition and Oppressive Surveillance in America
Updated: Jan 25
By Meredyth Dwyer
Facial recognition is quickly becoming a standard feature of daily life. From unlocking cellphones to speeding up airport security lines, facial recognition can be a useful tool for consumers. However, facial recognition remains mostly unregulated, which has allowed governments, businesses, and law enforcement to increasingly use the technology to surveil and exploit citizens without their knowledge.
To identify individuals, facial recognition software uses biometrics to map facial features and capture a “facial signature.” The facial signature, which is unique to each person, is then compared against a database of known faces to identify the individual (Martin 2019). Though this appears to be a straightforward process, facial recognition systems raise several concerns, particularly around coded racial bias.
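The matching step described above can be sketched in a few lines of code. This is a minimal illustration rather than any vendor's actual pipeline: the `cosine_similarity` and `identify` functions, the toy three-number "signatures," and the 0.9 match threshold are all illustrative assumptions. Real systems use neural networks to extract signatures (embeddings) with hundreds of dimensions from a face image.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(probe, database, threshold=0.9):
    """Return the best-matching identity, or None if nothing clears the threshold."""
    best_name, best_score = None, threshold
    for name, signature in database.items():
        score = cosine_similarity(probe, signature)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy database of "facial signatures" (purely illustrative values).
db = {"alice": [0.9, 0.1, 0.3], "bob": [0.1, 0.8, 0.5]}
print(identify([0.88, 0.12, 0.31], db))  # probe vector closest to alice's signature
```

The threshold is where bias enters in practice: if the system produces systematically noisier signatures for some demographic groups, their scores cluster nearer the cutoff and false matches become more likely for those groups.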
As Wired’s Tom Simonite reported, top-performing facial analysis software from companies such as Amazon, Google, and IBM misidentifies Black faces five to ten times more often than white faces (Simonite 2019). White males have the lowest error rate, about 0.8 percent, whereas Black women have the highest, at 34.7 percent (Hardesty 2018). These data indicate an anti-Black racial bias coded into facial recognition software, making Black individuals far more likely to be misidentified than white individuals.
Facial recognition’s potential for inaccurate identification is especially problematic in the context of law enforcement and government surveillance. For example, the Los Angeles County Police Department uses a facial recognition system housing roughly twelve million mugshots to identify suspects. However, its matching algorithm has consistently been shown to be less accurate at identifying women and people of color (Simonite 2019). This inaccuracy is a common issue across American police forces that use facial recognition. In 2020, Robert Williams, a Black Detroit native, was wrongfully accused of shoplifting after being misidentified by the Detroit Police’s facial recognition system (Rahal and Hicks 2020).
Law enforcement used facial recognition to target protesters who attended Black Lives Matter protests across the United States during the summer of 2020. In New York, the NYPD used facial recognition technology to track Derrick Ingram, a protester accused of assault after allegedly yelling in an officer’s face with a bullhorn (Vincent 2020). In 2015, Baltimore police similarly used facial recognition to identify and locate protesters following the police killing of Freddie Gray Jr., a young Black man who died in police custody from spinal injuries (Devich-Cyril 2020). Because American law enforcement agencies use facial recognition to surveil lawful protests, Black Lives Matter activists and organizers have encouraged protesters to cover their faces to prevent the technology from being used against them.
The American market for facial recognition is predicted to grow from $3.2 billion to $7 billion by 2024 (Martin 2019). Despite this rapid economic development, facial recognition technology remains largely unregulated in America. The lack of guidelines is especially troubling because market growth is expected to occur primarily in the surveillance and marketing sectors (Martin 2019). There are currently no U.S. federal laws addressing commercial uses of facial recognition, though some state legislatures have implemented laws to protect American consumers’ biometric data (Greenberg 2020). Without comprehensive regulatory frameworks, facial recognition can, and likely will, be used as a tool for oppression in the United States.
Over the course of 2020, there was a significant global increase in bills addressing the collection and use of biometric and facial recognition data (Greenberg 2020). Nevertheless, facial recognition continues to be a tool for the unjust treatment and surveillance of individuals in the United States by its government, police forces, and Big Data firms. Protecting civilians from biased surveillance will require substantial changes to current regulations.
In June 2020, IBM, Microsoft, and Amazon announced that they would not sell facial recognition products or services to local and state law enforcement until new federal regulations were passed (Devich-Cyril 2020). Those federal laws have yet to pass. At present, the only states with legislation regulating facial recognition are California, Illinois, and Washington, and their laws target only commercial uses of the technology (Greenberg 2020). The United States needs comprehensive legislation governing how law enforcement and government agencies use this racially biased technology to surveil citizens.
Devich-Cyril, Malkia. “Defund Facial Recognition.” The Atlantic, July 5, 2020. https://www.theatlantic.com/technology/archive/2020/07/defund-facial-recognition/613771/.
Greenberg, Pam. “Spotlight | Facial Recognition Gaining Measured Acceptance.” National Conference of State Legislatures, September 18, 2020. https://www.ncsl.org/research/telecommunications-and-information-technology/facial-recognition-gaining-measured-acceptance-magazine2020.aspx.
Hardesty, Larry. “Study Finds Gender and Skin-Type Bias in Commercial Artificial Intelligence Systems.” MIT News, February 11, 2018. https://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212.
Martin, Nicole. “The Major Concerns Around Facial Recognition Technology.” Forbes, September 25, 2019. https://www.forbes.com/sites/nicolemartin1/2019/09/25/the-major-concerns-around-facial-recognition-technology/?sh=47a1486e4fe3.
Rahal, Sarah, and Mark Hicks. “Detroit Police Work to Expunge Record of Man Wrongfully Accused with Facial Recognition.” The Detroit News, June 26, 2020. https://www.detroitnews.com/story/news/local/detroit-city/2020/06/26/detroit-police-clear-record-man-wrongfully-accused-facial-recognition-software/3259651001/.
Simonite, Tom. “The Best Algorithms Struggle to Recognize Black Faces Equally.” Wired, July 22, 2019. https://www.wired.com/story/best-algorithms-struggle-recognize-black-faces-equally/.
Vincent, James. “NYPD Used Facial Recognition to Track Down Black Lives Matter Activist.” The Verge, August 18, 2020. https://www.theverge.com/2020/8/18/21373316/nypd-facial-recognition-black-lives-matter-activist-derrick-ingram.