
Santa Cruz becomes first US city to ban predictive policing

Then-Commissioner of the New York Police Department William Bratton speaks at 2016 Concordia Summit on Sept. 20, 2016 in New York City. (Paul Morigi/Getty Images/TNS)

Nearly a decade ago, Santa Cruz was among the first cities in the U.S. to adopt predictive policing. This week, the California city became the first in the country to ban the practice.

In a unanimous decision Tuesday, the City Council passed an ordinance that bans the use of data to predict where crimes may occur and also bars the city from using facial recognition software.

In recent years, both predictive policing and facial recognition technology have been criticized as racially prejudiced, often contributing to increased patrols in Black or brown neighborhoods or false accusations against people of color.

Predictive policing uses algorithms that encourage officers to patrol locations identified as high-crime based on victim reports. The software “replicates and supercharges bias in policing by sending police to places that they’ve policed before — that is often going to be Black and brown communities,” said Matt Cagle, a technology and civil liberties attorney with the American Civil Liberties Union of Northern California.
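
To see how that feedback loop can arise, consider a minimal, hypothetical simulation — it is not any vendor's actual software — in which patrols go to the map cells with the most recorded incidents, but incidents are far more likely to be recorded where patrols already are. A small early surplus of reports in one neighborhood compounds day after day:

```python
# Hypothetical toy simulation of the feedback loop critics describe.
# This is NOT PredPol's algorithm: patrols go to the cells with the
# most recorded incidents, and incidents are far more likely to be
# recorded where patrols already are.

import random

random.seed(42)

NUM_CELLS = 10                   # the city divided into 10 map cells
TRUE_RATE = [0.3] * NUM_CELLS    # identical underlying crime rate everywhere
recorded = [0] * NUM_CELLS       # historical incident counts (the "training data")
recorded[3] = 5                  # cell 3 starts with a small surplus of old reports

def top_hotspots(counts, k=2):
    """Predict: patrol the k cells with the most recorded incidents."""
    return sorted(range(len(counts)), key=lambda c: counts[c], reverse=True)[:k]

for day in range(365):
    patrolled = top_hotspots(recorded)
    for cell in range(NUM_CELLS):
        crime_occurred = random.random() < TRUE_RATE[cell]
        # A crime is far more likely to be observed and recorded in a patrolled cell.
        detect_prob = 0.9 if cell in patrolled else 0.2
        if crime_occurred and random.random() < detect_prob:
            recorded[cell] += 1

# Despite identical true crime rates, cell 3's initial surplus snowballs:
# it is patrolled almost every day and ends the year with several times
# more recorded incidents than most other cells.
print("Recorded incidents per cell:", recorded)
```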

The Santa Cruz Police Department, which began predictive policing with a pilot project in 2011, had put a moratorium on the practice in 2017 when Andy Mills started as police chief. The new city ordinance bans the practice permanently.

Mills said predictive policing could have been effective if it had been used to work with community members to solve problems, but that didn’t happen. Instead, the technology was used “to do purely enforcement,” leading to unavoidable conflicts, he said.

“You try different things and learn later as you look back retrospectively,” Mills said. “You say, ‘Jeez, that was a blind spot I didn’t see.’ I think one of the ways we can prevent that in the future is sitting down with community members and saying, ‘Here’s what we are interested in using. Give us your take on it. What are your concerns?’ ”

Like predictive policing, facial recognition systems have also come under fire, with critics arguing the technology performs best on faces that resemble those it was built and trained on, skewing results along racial lines.

A Santa Cruz City Council report said that “despite purported technological advances, a recent National Institute of Standards and Technology study found that some forms of face recognition technology were 100 times more likely to misidentify people of African and Asian descent.”

While Santa Cruz is the first city in the nation to ban predictive policing, it follows other cities that banned facial recognition technology, notably San Francisco and Oakland last year.

In a recent test, facial recognition software incorrectly matched 26 California legislators with mug shots of people who had been arrested. California is considering banning such software from being used with police body cameras.

The City Council also voted Tuesday to evaluate additional police reforms following the police killing of George Floyd in Minneapolis.

“We’re really taking this situation seriously, and we are trying to be proactive in continuing this momentum toward actual systemic change,” Mayor Justin Cummings said during the meeting.

Cummings said he hopes to “hear from communities that are the most impacted and understand how we can support those communities moving forward.”

Nearly 400 local community members had signed an ACLU petition calling for the predictive policing ban, and a host of organizations also backed the ordinance — the NAACP Santa Cruz Chapter, American Friends Service Committee and Asian Americans Advancing Justice, to name a few.

Cagle said he didn’t hear a single speaker oppose the initiative before the long-anticipated ordinance passed.

Even Santa Cruz-based predictive policing company PredPol, whose software was used by the Santa Cruz Police Department until 2017, backed the new rules.

“Any government agency that applies technology to its operations should have a process to ensure that it does not result in racially inequitable outcomes,” PredPol CEO Brian MacDonald wrote in an email to The Times. The company is confident its software is not racially biased, MacDonald said, and therefore meets the conditions in the city’s ordinance.

Given the ban’s widespread support and the research behind it, Cagle doubts the policy could ever be reversed.

“Santa Cruz’s ban is ironclad, and it requires new public legislation and finding that the technology cannot perpetuate bias” if such technology were to ever be used again, he said.

Southern California has been the home of predictive policing since the late 2000s. The concept is the brainchild of former LAPD Chief Bill Bratton, who wanted to determine whether past data could predict future crime locations.

As police departments and universities tweaked the model, they determined their map could forecast crime types, locations and times. In 2012, the software company PredPol chose Santa Cruz as its home, citing the local success of predictive policing.
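
PredPol’s academic co-founders have published research modeling crime with self-exciting point processes, a technique borrowed from earthquake aftershock forecasting in which each incident temporarily raises the predicted rate nearby. The sketch below follows that published formulation, but every parameter value is an invented assumption, not the company’s actual code:

```python
# Sketch of a self-exciting point-process forecast, the class of model
# described in published research by PredPol's academic co-founders.
# Every parameter value below is an invented assumption for illustration.

import math

MU = 0.1      # assumed background rate: incidents per cell per day
THETA = 0.5   # assumed strength of the boost each incident triggers
OMEGA = 0.2   # assumed decay rate: the boost fades over a few days

def predicted_rate(now, past_incident_times):
    """Forecast the incident rate for one grid cell at time `now` (in days):
    lambda(t) = mu + sum_i theta * omega * exp(-omega * (t - t_i))."""
    boost = sum(
        THETA * OMEGA * math.exp(-OMEGA * (now - t))
        for t in past_incident_times
        if t < now
    )
    return MU + boost

# A cell with a fresh cluster of incidents ranks well above a quiet one,
# so the day's patrol map would highlight it.
cluster_cell = [10.0, 11.5, 12.0]   # three incidents in the last few days
quiet_cell = [1.0]                  # one incident eleven days ago

print(predicted_rate(12.5, cluster_cell))  # elevated rate (~0.33)
print(predicted_rate(12.5, quiet_cell))    # near the background rate (~0.11)
```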

Since then, some studies have shown the practice reduces crime, but others have found it to have a negligible impact.

While civil liberties organizations flag predictive policing as racially biased, MacDonald said he stands by PredPol’s algorithm.

“If the command staff provides no explicit guidance, officers have to rely on their ‘judgment’ or ‘intuition’ as to where to patrol. Human judgment is of course prone to error and subject to bias, whether conscious or subconscious,” he wrote.

The LAPD announced in October that it would make changes to its PredPol program, including creating a data-driven policing unit to oversee all crime-fighting strategies and seeking input from community groups before implementing new data programs.

The department also said it would develop a system to provide periodic reports about data programs and outcomes with statistics on people and locations targeted for intervention.

The changes were made seven months after an inspector general couldn’t determine whether the LAPD’s predictive-policing program helped reduce crime.

Though MacDonald maintains that well-formulated algorithms can mitigate racial bias in policing, Cagle questions the practice.

“Private for-profit companies shouldn’t be dictating policing in any American community,” he said. “Banning facial recognition won’t dismantle the racism and bias that pervades policing in America, but it does take away a system that we know will further exacerbate that problem.”

___

© 2020 Los Angeles Times

Distributed by Tribune Content Agency, LLC.