
Source: Amnesty International NZ

  • In the context of racially discriminatory policing and racial profiling of Black people the use of Facial Recognition Technology (FRT) could exacerbate human rights violations by police, while also undermining the right to peaceful protest and the right to privacy.
  • Law enforcement is violating people’s human rights daily out on the streets. We need police to fulfil their obligations to ensure people’s right to protest killings by police and the right of journalists to cover those protests. Law enforcement has a responsibility to facilitate the right to peaceful protest – not quash it.
  • We are proud to stand with organizations like the Algorithmic Justice League, the ACLU, the Electronic Frontier Foundation and others who have highlighted the dangers of FRT. Amnesty calls for a ban on the use, development, production, sale and export of facial recognition technology for mass surveillance purposes by the police and other state agencies.

Facial recognition technology (FRT) is an umbrella term for a suite of applications that use a human face to verify or identify an individual. FRT can create a means to identify and categorize people at scale based on their physical features, including observations or inferences of protected characteristics – for example, race, ethnicity, gender, age and disability status.

This technology has seen a huge uptake in recent years – particularly in the realm of law enforcement. For instance, FRT company Clearview AI claims to work with over 600 law enforcement agencies in the US alone. Other FRT companies such as Dataworks Plus also sell their systems to police departments across the country.

We are seeing this play out daily in the United States, where police departments across the country are using FRT to identify protesters. 

The use of FRT by police violates human rights in a number of different ways. First, in the context of racially discriminatory policing and racial profiling of Black people, the use of FRT could exacerbate human rights violations by police in their targeting of Black communities. Research has consistently found that FRT systems process some faces more accurately than others, depending on key characteristics including skin color, ethnicity and gender. For instance, the National Institute of Standards and Technology (NIST) measured the effects of race, age and sex on leading FRT systems used in the US – according to Dr Charles H. Romine, the Director of NIST, “the study measured higher false positive rates in women, African Americans, and particularly in African American women”.

Further, researchers at Georgetown University warn that FRT “will disproportionately affect African Americans”, in large part because there are significantly more black faces on US police watchlists than white faces. “Police face recognition systems do not only perform worse on African Americans; African Americans are also more likely to be enrolled in those systems and be subject to their processing” (‘The Perpetual Line-Up: Unregulated Police Face Recognition in America’, Clare Garvie, Alvaro Bedoya, Jonathan Frankle, Center on Privacy & Technology at Georgetown Law, Georgetown University, Washington DC (2016)).

Second, where FRT is used for identification and mass surveillance, “solving” the accuracy rate problem and improving accuracy rates for already marginalised or disadvantaged groups does not address the impact of FRT on both the right to peaceful protest and the right to privacy. For instance, Black people already experience disproportionate interference with privacy and other rights, and ‘improving’ accuracy may only amount to increasing surveillance and disempowerment of an already disadvantaged community.

FRT entails widespread bulk monitoring, collection, storage, analysis or other use of material, including the collection of sensitive personal data (biometric data), without individualized reasonable suspicion of criminal wrongdoing – which amounts to indiscriminate mass surveillance. Amnesty International believes that indiscriminate mass surveillance is never a proportionate interference with the rights to privacy, freedom of expression, freedom of association and of peaceful assembly.

States must also respect, protect and fulfil the right to peaceful assembly without discrimination. The right to peacefully assemble is fundamental not only as a means of political expression but also to safeguard other rights. Peaceful protests are a fundamental aspect of a vibrant society, and states should recognize the positive role of peaceful protest in strengthening human rights.

It is often the ability to be part of an anonymous crowd that allows many people to participate in peaceful assemblies. As UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression David Kaye has stated: “In environments subject to rampant illicit surveillance, the targeted communities know of or suspect such attempts at surveillance, which in turn shapes and restricts their capacity to exercise rights to freedom of expression [and] association”. 

Therefore, just as the mere threat of surveillance creates a chilling effect on the free expression of people’s online activities, the use of facial recognition technology will deter people from freely attending peaceful assemblies in public spaces.

A wave of local legislation in 2019 has brought restrictions on FRT use in law enforcement to numerous US cities, including San Francisco and Oakland in California, and Somerville and Brookline in Massachusetts. San Diego suspended law enforcement use of FRT starting January 2020. Portland, Oregon, is currently considering a progressive ban on use by both state and private actors. Lawmakers in Massachusetts are meanwhile debating a state-wide ban on government use of FRT.

Amnesty is calling for a ban on the use, development, production, sale and export of facial recognition technology for mass surveillance purposes by the police and other state agencies. We are proud to stand with organizations like the Algorithmic Justice League, the ACLU, the Electronic Frontier Foundation and others who have highlighted the dangers of FRT.
