Amazon suspends police use of its facial recognition technology for one year

But no mention of use by intelligence agencies, the military and other law enforcement bodies

Amazon announced on Wednesday that it is pausing police use of Rekognition, the company's facial recognition software, for one year.

In a statement, the e-commerce giant said that it is implementing a "one-year moratorium on police use of Rekognition" although it will "continue to allow organisations like Thorn, the International Center for Missing and Exploited Children and Marinus Analytics to use Amazon Rekognition to help rescue human trafficking victims and reunite missing children with their families."

The company made no mention of use by intelligence agencies, the military and other law enforcement bodies.

Amazon added that it backs the call for governments to introduce stronger regulations to govern the ethical use of facial recognition technology, and hopes that the one-year moratorium will give the US Congress ample time to bring in appropriate rules regarding the use of the technology.

Amazon's decision follows IBM's announcement earlier this week that the company would no longer develop or sell facial recognition software over concerns that the technology could be used to promote racial injustice and discrimination.

Amazon has also expressed its support recently for the Black Lives Matter movement, which calls for police reform to bring an end to the inequitable treatment of black people in the US.

The death of George Floyd, a black man who died in police custody in Minneapolis last month, has fanned concerns that facial recognition technology would be used unfairly against protesters.

Many people have criticised Amazon for being hypocritical as it continues to sell its facial recognition products to police forces. Some civil liberties activists also described Amazon's latest decision as little more than a public relations stunt.

The American Civil Liberties Union Foundations of California (ACLU) said yesterday that Amazon must completely stop selling its face recognition software until the dangers of the technology are fully addressed.

Facial recognition systems, like many forms of artificial intelligence (AI), have a long history of racial bias. A number of studies in recent years have shown that most algorithms are more likely to incorrectly identify the faces of black people and other minorities than those of white people.

Last year, a study by researchers from MIT Media Lab and the University of Toronto claimed that facial recognition software often gives inaccurate results, frequently mistaking dark-skinned females for dark-skinned males.

The study also highlighted the issue of racial and gender bias in Amazon's Rekognition software, which achieved only 68.6 per cent accuracy when identifying dark-skinned females.

While Amazon rejected the findings, a group of experts backed the study and asked Amazon to stop selling the software to law enforcement agencies.

Earlier, in 2018, an experiment run by the ACLU showed that Rekognition wrongly matched 28 members of Congress to pictures of people arrested for a crime.

Last year, the UK Information Commissioner's Office (ICO) issued a warning to police over the use of live facial recognition and called for a statutory code of practice to be introduced to govern its use by police forces.