
Police won’t be using IBM, Amazon facial recognition technology for now as companies either pause or suspend its use

A VeriScan facial recognition tablet takes a photo of an airline passenger at Dulles International Airport in Dulles, Va., Sept. 6, 2018. (U.S. Customs and Border Protection/Flickr)

Facial recognition technology, a controversial piece of software criticized for its inaccuracies and potential to violate people’s civil liberties, will not be used by Amazon or IBM any time soon, the companies said this week.

Amazon Rekognition, the company’s face identification technology that launched in 2016 and has been sold to multiple U.S. government agencies, will not be used by law enforcement for at least a year, the business said in a statement.

The company’s one-year moratorium on the software’s use by police does not apply to organizations like Thorn, the International Center for Missing and Exploited Children and Marinus Analytics. Those groups can use the software to help rescue human trafficking victims and reunite missing children with their families, according to Amazon’s statement.

Technology that uses artificial intelligence to identify people based on their facial features, using databases of photographs to do so, has been condemned by activists, public officials and even some police officers for its capacity to intrude on personal privacies.

Opponents of the software have also claimed it has been abused by totalitarian governments such as China’s and that it frequently misidentifies individuals, especially people of color.

Despite concerns surrounding facial recognition and other forms of biometric surveillance, the technology remains largely unregulated at the state and national levels. However, activists and technology business leaders have noted that Congress may soon restrict law enforcement’s use of facial recognition.

“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” Amazon said in its statement. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”

IBM was more vocal in its condemnation of the technology. In a letter sent to congressional leaders, CEO Arvind Krishna noted his business no longer offers “general purpose” facial recognition or analysis software.

The U.S.-based computer hardware company firmly opposes using any technology, including facial recognition software, for mass surveillance, racial profiling and the violation of basic human rights and freedoms, Krishna said.

“We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,” the CEO wrote in his letter. “Artificial intelligence is a powerful tool that can help law enforcement keep citizens safe.

“But vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported.”

The two companies’ announcements come amid nationwide outrage over the killing of George Floyd, an unarmed black man who died after a white Minneapolis police officer knelt on his neck for more than eight and a half minutes, even after Floyd lost consciousness.

Floyd’s death has sparked hundreds of protests throughout the world, many of which have called for an end to systemic racism and an overhaul of the policing model in the United States.

In his letter to Congress, Krishna wrote that the “horrible and tragic” killings of Floyd, as well as Ahmaud Arbery, Breonna Taylor and too many others, are a reminder that the fight against racism remains urgent.

“To that end, IBM would like to work with Congress in pursuit of justice and racial equity, focused initially in three key policy areas: police reform, responsible use of technology, and broadening skills and educational opportunities,” the CEO said.

He suggested that Congress try more police misconduct cases federally, create a national registry of officers’ wrongdoing and modify qualified immunity, the Supreme Court doctrine that shields government officials, including police officers, from being sued over discretionary actions performed on the job.

The CEO also urged congressional leaders to increase transparency and help police protect communities by legislating technologies like body cameras, modern data analytics techniques and facial recognition.

To date, no state has passed a law banning the software, though a bill under consideration in the Massachusetts State House would, if passed, place a moratorium on government agencies’ use of facial recognition technology.

As state and federal legislation governing the software’s use is lacking, many towns and cities throughout the country have taken it upon themselves to pass municipal bans of the technology.

In Massachusetts, such restrictions have been signed into law in Brookline, Cambridge, Northampton, Somerville and, most recently, Springfield. City councilors in Boston and Easthampton are also considering banning the software.

The majority of the communities in the commonwealth that passed ordinances prohibiting the use of facial recognition did so in collaboration with the American Civil Liberties Union of Massachusetts, which started a campaign last summer called “Press Pause on Face Surveillance” to bring awareness to the issue.

The group has been critical of Amazon Rekognition and last year tested the facial recognition software by attempting to identify 188 New England athletes. The test misidentified 28 of the players, matching them to mugshots in an arrest photograph database.

The digital rights group Fight for the Future has also lambasted Amazon’s technology. The organization, which was founded in Worcester but is now based in Boston, has fought for facial recognition to be completely restricted on college campuses and elsewhere.

In a statement Wednesday, the group called Amazon’s one-year moratorium on its facial recognition software “nothing more than a public relations stunt” and a sign that the technology has become “increasingly politically toxic.”

“Amazon knows that facial recognition software is dangerous. They know it’s the perfect tool for tyranny. They know it’s racist – and that in the hands of police it will simply exacerbate systemic discrimination in our criminal justice system,” Evan Greer, deputy director of Fight for the Future, wrote in her statement.

___

© 2020 MassLive.com