Facial recognition bans: What’s next in Oakland, at Amazon and more

An Amazon bookstore in Portland, OR (Steve Morgan/WikiCommons)

Efforts to rein in government use of facial recognition have a big couple of weeks ahead, days after San Francisco approved a first-of-its-kind ban on use of the technology by police and other city agencies.

Across the bay, Oakland is expected to consider a similar ban for the city’s agencies, possibly next week. Up north, Amazon investors will vote Wednesday on a shareholder proposal urging the company to stop selling its Rekognition software to the government. And the same day, the U.S. House Oversight Committee will hold a hearing on the use of face recognition by government agencies.

Facial recognition, already used by police and other government agencies in some major U.S. cities, has sparked a backlash from civil rights advocates and others concerned about the technology’s accuracy and its potential effect on privacy and safety of the public, especially minorities.

“If (face recognition) becomes a tool of ICE and police departments, it could result in a gross violation of human rights,” said Pat Mahoney, a nun at St. Joseph’s in New York who works with the Tri-State Coalition for Responsible Investment. TriCRI, whose member congregations own Amazon stock, is proposing that the internet giant stop selling its Rekognition software to government agencies unless it can prove it won’t harm human and civil rights.

A separate shareholder proposal asks that Amazon study the effects of Rekognition — which last year mistook the faces of members of Congress for mugshots, the ACLU reported — and release a report with the results.

Stan Shikuma, of the Seattle chapter of the Japanese American Citizens League, will present that resolution at the company’s shareholder meeting. During a press call last week, he expressed concern that Rekognition will “exacerbate over-policing of brown, black and yellow people… it’s particularly dangerous given the spread of hate and fear today.”

The ACLU also will be there to urge shareholders to support the proposals. In an open letter to shareholders, the group said the company continues to sell its product to police while refusing “to disclose which agencies have purchased it and how they are using it.”

Amazon is recommending shareholders vote against both proposals, saying: “Facial recognition technology significantly reduces the amount of time it takes to identify people or objects in photos and video. This makes it a powerful tool for business purposes, but just as importantly, for law enforcement and government agencies to catch criminals, prevent crime, and find missing people.”

In Oakland, the proposed ban on facial recognition may go before the public safety committee May 28, according to Brian Hofer, the chairman of the Oakland Privacy Advisory Commission who has helped draft ordinances limiting surveillance by public agencies around the Bay Area. After that, it would go to the city council. If passed, the ban would supplement the city’s existing measures to limit its agencies’ use of surveillance technology.

The Oakland Police Department has not returned repeated requests for comment.

These efforts follow a bill recently passed by the California State Assembly that would ban facial recognition and other biometric surveillance technology in police body cameras. That legislation, authored by Assemblyman Phil Ting, D-San Francisco, awaits approval by the state Senate.

If body cams were to gain facial-recognition capabilities, it would be “the equivalent of requiring every person to carry an ID at all times,” said Matt Cagle, technology and civil liberties attorney with the ACLU of Northern California.

Daniel Castro, vice president at the Washington-based think tank Information Technology & Innovation Foundation, opposes any bans or moratoriums on facial recognition, saying they could hold back the technology. He acknowledged the possible risks of using facial recognition and said “it’s important that police have best practices” and proper oversight. But he said the technology can help with efficiency and public safety.

“We want policies about appropriate use by police,” Castro said. “This should not be about the technology, but how it’s used.”

A report released last week by Georgetown Law’s Center on Privacy & Technology included examples of how facial recognition has done harm. Last month, a Brown University student was mistakenly identified by Sri Lankan authorities, using facial recognition, as a suspect in a terrorist bombing there. She said at a news conference that she received death threats as a result. Another example the report cited: in 2015, Baltimore police reportedly used facial recognition to track people protesting the death of Freddie Gray, who died while in police custody.

———

© 2019 The Mercury News (San Jose, Calif.)

Distributed by Tribune Content Agency, LLC.