Unbeknownst to the mayor and most city leaders, Norfolk police last year quietly started using a controversial facial recognition program that could end people’s ability to anonymously walk around in public.
Detectives were so impressed with how the technology identified unknown suspects and helped solve crimes that they pushed the top brass to shell out thousands of dollars a year to make facial recognition one of their permanent crime-fighting tools.
Success seemed imminent.
But Chief Larry Boone nixed their plan to have Norfolk police pay to use Clearview AI, an app made by a tech startup of the same name that’s been aggressively marketing its services to law enforcement. Boone told The Virginian-Pilot in a June interview that the public needed to know and talk about such a hot-button issue before police added it permanently to its investigative repertoire.
Norfolk police stopped using the app in February, he said.
“That can be perceived to be so intrusive — Big Brother is watching,” Boone said during the interview. “Our current society isn’t quite ready for that.
“They don’t trust it.”
Even though Norfolk police have stopped using Clearview’s facial recognition program, questions remain about the use of such technology to investigate crimes, whether in Virginia or across the country. There’s little oversight, legal or scientific, of how our images can be collected and used — even when they could help send people to prison.
Investigators in the nation’s roughly 18,000 law enforcement agencies are often able to unilaterally adopt whatever investigative tools they want, whether or not they’ve been rigorously tested by scientists or other law enforcement professionals.
Five of eight Norfolk city council members said they didn’t know police were using Clearview until a Pilot reporter told them: Mayor Kenny Alexander, Andria McClellan, Mamie Johnson, Paul Riddick and Tommy Smigiel. Councilwoman Angelia Williams Graves said she knew the department was using Clearview, but couldn’t remember when she heard about it, or from whom. Council members Martin Thomas Jr. and Courtney Doyle didn’t respond to several voicemails, texts, and emails sent over the past five months.
In the three months they used Clearview, Norfolk gang detectives uploaded photos of unknown people and suspects to the company’s app, which then spit out public photos of those people, along with links to where those photos appeared. The system — powered by a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever built by the United States government or Silicon Valley heavyweights, according to a Jan. 18 New York Times story headlined “The Secretive Company That Might End Privacy as We Know It.”
The Norfolk Police Department was one of more than 600 law enforcement agencies that, without public scrutiny, started using Clearview between Jan. 1, 2019, and January of this year, according to The Times story. Clearview declined to provide a list of those agencies to The Times for its story earlier this year, and Jessica Medeiros Garrison, a Clearview sales representative, didn’t answer several emails and texts sent by The Pilot over the past two weeks requesting an interview and asking how many law enforcement agencies are using the company’s program.
The six municipal police departments in Hampton Roads and the Virginia State Police told The Pilot they haven’t used Clearview and have no plans to do so.
___
About face
The genesis of Clearview was a 2016 meeting at a conservative think tank between Hoan Ton-That, a tech entrepreneur in his early 30s, and Richard Schwartz, who was an aide to Rudy Giuliani when Giuliani was mayor of New York, according to The Times exposé. The two agreed to start a facial recognition company. Ton-That would develop the technology; Schwartz would tap his Rolodex of conservative powerbrokers to drum up business.
The result: a system that uses what Ton-That described to The Times as a “state-of-the-art neural net” to convert every image into a mathematical formula, or vector, based on facial measurements — like how far apart a person’s eyes are. That vector is an individual’s “faceprint.”
Clearview created a directory that grouped photos with similar vectors into “neighborhoods.” When a user uploads a photo of someone’s face into Clearview’s system, it shows all the scraped photos in the same “neighborhood” — in other words, the ones the app thinks are likely of the same person. It also links to the sites from which those images came, which frequently gives a user the person’s name.
The technology is complicated, but the idea is simple: Plug in a photo of a person you want to identify, and the app spits out any photo of that person that’s ever been publicly uploaded to the web.
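That description maps onto a standard pattern in face search: a model turns each face into a fixed-length vector, and identification becomes a nearest-neighbor lookup over stored vectors. Below is a minimal sketch in Python of how such a lookup can work. Everything in it is illustrative, not Clearview's actual code: the FaceIndex class, the embed_face stand-in, and the 0.8 similarity threshold are all assumptions for the example.

```python
import numpy as np

# Minimal sketch of the lookup described above -- illustrative only.
# Clearview's neural net and index are proprietary; embed_face() is a
# hypothetical stand-in for a model that maps a face image to a vector.

def embed_face(image) -> np.ndarray:
    """Stand-in for a neural net that turns a face image into a
    fixed-length vector -- the 'faceprint' described above."""
    raise NotImplementedError("swap in a real face-embedding model")

class FaceIndex:
    """Stores faceprints alongside the URLs they were scraped from and
    returns the closest matches -- the 'neighborhood' -- for a query."""

    def __init__(self):
        self.vectors = []  # unit-length faceprint vectors
        self.urls = []     # source URL for each stored faceprint

    def add(self, vector: np.ndarray, url: str) -> None:
        # Normalize so a dot product equals cosine similarity.
        self.vectors.append(vector / np.linalg.norm(vector))
        self.urls.append(url)

    def search(self, query_vector: np.ndarray, threshold: float = 0.8):
        q = query_vector / np.linalg.norm(query_vector)
        sims = np.stack(self.vectors) @ q   # similarity to every stored face
        order = np.argsort(sims)[::-1]      # best matches first
        return [(self.urls[i], float(sims[i]))
                for i in order if sims[i] >= threshold]
```

At the scale of billions of photos, a brute-force scan like this would typically be replaced by an approximate nearest-neighbor index, but the matching idea is the same.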
But who would buy the product? Early on, the company struggled to settle on a market. Pitch it to parents as a way to vet babysitters? Sell it to hotels, where concierges could greet guests by name as they arrived?
Eventually, they homed in on law enforcement, according to The Times article. In February 2019, the Indiana State Police became one of the first agencies to use the app and solved a case within 20 minutes of using it — a shooting in a park where a bystander had taken a video of the crime. Police ran a still of the shooter’s face through the app and — boom — a match to a video on social media with a name in the caption. Clearview had given them a suspect.
Meanwhile, the company started an aggressive promotional push, trying to get police departments around the country to try the app. In Norfolk, gang detectives got a promotional email in November from Police One, an online news site geared toward law enforcement officers. It was a Clearview advertisement that offered to let investigators “solve crimes instantly and save lives” with a computer program that’s “like Google Search for faces.”
The ad told the detectives that Clearview had let law enforcement break up an online child pornography ring and rescue a 7-year-old girl, crack a 32-year-old unsolved homicide, bust a multimillion-dollar credit card scam and “solve hundreds of other cases across the country.”
And then the hook: “Try it out for free right now” with a link to a few questions to verify they were law enforcement. It was Clearview’s most effective sales technique: offering 30-day free trials to officers, who would then encourage their superiors to pay for annual licenses once they saw how well it worked.
The two detectives signed up.
Over the next three months, Norfolk police used Clearview in about 20 investigations, and it helped them make nine arrests, Boone said.
But facial recognition was never the only evidence against a suspect, the chief added. Feeding Clearview a photo of someone might give you a name, but that just helps you start your investigation; it’s not the end of it, he said. That name leads detectives to, say, check the suspect’s criminal record; scour their Facebook page; or interview them, their friends, or family — whatever the case requires.
“You still go on doing your old-fashioned investigative police work,” Boone said.
On Jan. 14, after the Norfolk detectives had been using the app for two months, Clearview sales rep Garrison told them their trial period was ending and asked if they wanted to buy an annual subscription, according to emails obtained by The Pilot through the state’s public records law.
One of the gang detectives, Officer Toofan Shahsiah, replied to Garrison that same morning: “So far, the app has been VERY helpful,” he said, adding that he thought superiors wanted to keep using Clearview. Garrison extended the department’s trial accounts for another 30 days.
On Jan. 27, Garrison checked back in. Sgt. Brandon Goins said he was about to write a procurement letter and send it up through his chain of command to the police chief for approval. He told her he was recommending the department buy two subscriptions, for a total of $4,000 a year.
A week and a half later, the deal wasn’t done. Garrison poked Goins again. He told her he was still “waiting on a final word,” and that he thought “the inquisition from the news reporter is slowing things up.”
The Pilot had asked about the department’s use of facial recognition technology on Jan. 24. Upon learning Norfolk detectives were using Clearview, a Pilot reporter requested a demonstration of how the program worked. Pickering, who was handling the request for the department, then emailed Garrison, asking if she would explain it. After she replied, “[i]t’s been our experience with law-enforcement that they don’t particularly share their investigative techniques with the media,” Pickering rejected the reporter’s request. On Feb. 13, The Pilot used the state’s public records law to request police department emails that mentioned “Clearview.”
Final word eventually came back to Goins: a no-go. Boone did not sign off on buying the Clearview subscriptions and, in fact, told his assistant chief to make sure no one in the department was using it, even on a trial basis. The Pilot’s “inquisition” didn’t factor into the decision to stop using Clearview, Boone said.
But top-down orders aren’t always followed by the rank-and-file. The Raleigh Police Department was a paying client before cutting ties with Clearview and banning officers from using its program, according to a Feb. 27 BuzzFeed News story. Despite that, Clearview records obtained by BuzzFeed News revealed Raleigh police officers kept using the app and even signed up for new free trials after the moratorium took effect.
___
Face plant
Police departments have had access to facial recognition for nearly 20 years, but they have usually been limited to searching government-owned images, like mugshots and driver’s license photos. Those pictures are easier to analyze because they’re more uniform — they have similar lighting and backgrounds, and people are directed to look right at the camera.
In fact, the Virginia Beach Police Department was one of the first law enforcement agencies in the country to try out the technology in the early- to mid-2000s.
The experiment, funded by a $150,000 state grant and $50,000 in local taxpayer money, was hampered by the limited scope of its database and by the nascent technology itself. The vendor claimed a 99.3% accuracy rate in laboratory tests under ideal lighting, but the software proved impotent out in the wild, homing in on what it thought were faces but what turned out to be pictures on a T-shirt, leaves on the ground, and even the breasts and navels of bikini-clad women.
Closed-circuit cameras watched people walking down a four-block stretch of the Oceanfront as software scanned their faces in real time. Officers at a computer would check out hits and, if need be, radio to an officer on the ground to track down potential suspects to verify their identities.
Police in Tampa, Florida, tried something similar. Both cities shut down their experiments years later without making any arrests.
There were key differences between what Virginia Beach did and what many police departments are doing with Clearview, aside from 18 years of technological advances.
Virginia Beach police compared images of thousands of people who walked at the Oceanfront to a small set of images in their database — about 700 people, including those with outstanding felony warrants, missing children, and the FBI’s most wanted fugitives and terrorists. Images plugged into Clearview are checked against more than three billion photos.
And Virginia Beach police owned and controlled their database, whereas the photos investigators feed into Clearview — which could be of crime victims or witnesses — become a part of a database owned by a private company whose ability to protect its data is untested.
Finally, Virginia Beach had a public debate before using facial recognition, unlike its counterparts today in Norfolk and other cities. Leading up to the city council vote, the department announced its plans, held a town hall, and briefed council members as well as the media. Plus, it created a citizens’ advisory committee that included people from civil rights and minority groups who could randomly audit the system to make sure police were complying with their own policies.
“Above all else, the Department believed that full disclosure was paramount to implementing a successful program,” according to a 2005 report of the General Assembly’s Joint Commission on Technology and Science.
In contrast, Norfolk police didn’t notify the council or the public. That’s possible, in part, because of how Clearview has set up its infrastructure: detectives need only a photo and five minutes to create a Clearview account.
___
Facing questions
After The Times published its January exposé, several major tech companies — including Facebook, Google, Twitter and Venmo — demanded in cease-and-desist letters that Clearview stop using photos scraped from their websites. The state of New Jersey banned law enforcement from using it. A host of plaintiffs have filed lawsuits against Clearview in multiple states.
That includes the ACLU, whose lawsuit called Clearview’s app “the nightmare scenario” that could lead to “unwanted tracking and invasive surveillance by making it possible to instantaneously identify everyone at a protest or political rally, a house of worship, a domestic violence shelter, an Alcoholics Anonymous meeting, and more.”
More than anything, it’s the unknowns that are spooking people. And when it comes to Clearview, a small startup that was virtually unknown until a year ago, there are many.
Most simply, how well does its technology work? In promotional materials, the company claims it works 99% of the time, and in what it called an “internal accuracy test,” the program correctly identified 834 federal and state lawmakers whose images were submitted. But Ton-That told The Times that Clearview accurately identifies someone in submitted photos 75% of the time.
Regardless, it’s unclear how often the tool delivers false matches, because it hasn’t been tested by an independent party like the National Institute of Standards and Technology, a federal agency that rates the effectiveness of facial recognition programs.
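To see why that distinction matters, consider two hypothetical systems, with made-up numbers that have nothing to do with Clearview: both find the right person in 75 of 100 searches, yet they differ twentyfold in how often they return the wrong person. A hit-rate figure alone cannot tell them apart:

```python
def rates(correct_hits: int, searches_with_a_true_match: int,
          wrong_person_hits: int, total_searches: int):
    """Two separate questions a single 'accuracy' number can hide."""
    hit_rate = correct_hits / searches_with_a_true_match    # finds the right person
    false_match_rate = wrong_person_hits / total_searches   # fingers the wrong person
    return hit_rate, false_match_rate

# Hypothetical System A: 75% hit rate, 1 in 100 searches returns a wrong person.
print(rates(75, 100, 1, 100))   # (0.75, 0.01)
# Hypothetical System B: same 75% hit rate, 20 in 100 return a wrong person.
print(rates(75, 100, 20, 100))  # (0.75, 0.2)
```

Independent testing of the kind NIST performs is what separates cases like A from cases like B.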
Facial recognition technology has always disproportionately misidentified people of color because programmers don’t feed their algorithms enough faces of Black and brown people, said Claire Gastañaga, executive director of the ACLU of Virginia. Even with three billion images, there’s no guarantee Clearview has gathered enough photos of non-white faces to overcome this pitfall.
Since Clearview and its proprietary technology control what an officer gets after uploading a photo, could the company manipulate those results?
Kashmir Hill, the reporter who wrote the New York Times story, said a police department ran her photo through Clearview at her request and got no results. But the company soon called those officers and asked if they were talking to the media. Clearview blamed a “software bug” for the lack of search results despite Hill having years’ worth of public photos on social media.
Are your photos in Clearview’s database? Who knows? Clearview is a private company using its own technology and private database, even though it sells its service to government agencies.
Ton-That has told multiple news agencies that Clearview uses only publicly available images. If people change their Facebook privacy settings so search engines can’t link to their profiles, those users’ photos won’t be included in Clearview’s database, Ton-That told The Times.
But if the company has already scraped your images, it’s too late, though Ton-That told The Times the company was working on a tool that will allow people to request their pictures be removed if they’re no longer on the website from which Clearview scraped them.
Who is using Clearview? In February, Ton-That said on Fox Business that Clearview was being used “strictly for law enforcement to do investigations.”
But BuzzFeed News reported in February it had obtained a list of Clearview’s users, which included 2,200 law-enforcement and other government agencies — but also private organizations such as the NBA, Wells Fargo, Bank of America, Albertsons, Rite Aid, Macy’s, Best Buy, Home Depot and Walmart.
Clearview’s logs showed it had also given access to agencies with ties to governments in Saudi Arabia and the United Arab Emirates, where being gay is a crime.
Law-enforcement users in February included local police but also the international police organization Interpol and several U.S. federal agencies: the FBI, the Department of Homeland Security, Customs and Border Protection and Immigration and Customs Enforcement.
Because Clearview didn’t respond to The Pilot’s questions, it’s unclear if any have stopped using it — or if even more agencies have signed up.
But, according to Hill, The Times reporter, the number of law enforcement agencies logging on and doing searches was on the rise when she first put the company into the public consciousness in January.
“Law enforcement loves it, and…it’s spreading like wildfire,” she said while being interviewed on The Times’ podcast The Daily.
And those are just a few aspects of Clearview that are unknown or unproven, Gastañaga said.
These issues are not new to Clearview or facial recognition technology. More than a decade ago the National Academy of Sciences released a report about the weak science underpinning much of the forensic evidence being used in courtrooms across the country to convict people and send them to prison.
Prosecutors once used bite mark identification as evidence, but that’s been exposed as “junk science,” Gastañaga said. Vetting new methods and techniques — whether it’s body cameras, shot trackers, or red light and speed cameras — takes time and professional analysis. That can’t happen if individual detectives, sometimes without the knowledge of their superiors, pick up a shiny, new toy and just start using it.
“Whatever technology they’re wanting to use to make their job easier, the ACLU believes — deeply and profoundly — that the public should have a role in that conversation,” Gastañaga said.
Dana Schrad, executive director of the Virginia Association of Chiefs of Police, said she didn’t know about Norfolk police using Clearview, and admitted that the technology behind Clearview’s tool is unproven. But facial recognition has the potential to give police unbiased information, eliminating some of the disparities Gastañaga mentioned, she said.
And investigators aren’t automatons blindly reacting to Clearview, other facial recognition tools, or any single piece of evidence, Schrad added. They put each piece of evidence in context with everything else they have, and do more investigating based on it.
“It’s just one tool,” she said. “It’s not the only tool you use.”
But it can be an effective one that helps police put criminals — sometimes people who hurt others — in prison, she said. Police departments are strapped for cash and struggling to attract people into their ranks. Any tool that lets them fight crime with less money and manpower is a win.
___
Saving face
In Virginia and many other states, lawmakers have set no limits on how government agencies can use facial recognition technology, nor are they talking about the benefits and problems it could bring, Del. Lashrecse Aird said.
“There just aren’t any (guardrails),” the Petersburg Democrat added in a phone interview. “It’s basically the wild wild west out there.”
She and some other lawmakers are trying to change that. Aird introduced a bill in the 2020 General Assembly session — just before The Times’ exposé, as it turned out — that would have required lawmakers to study facial recognition and artificial intelligence. The bill died in committee, but the Joint Commission on Technology and Science is studying facial recognition anyway.
And Del. Cliff Hayes, who heads the commission, is also chairing an advisory committee about law enforcement’s use of facial recognition. Hayes, D-Chesapeake, said he’s held off on calling meetings during the legislature’s special session on police reform and the coronavirus, but plans to restart once it’s over. His aim is to hear from tech experts and law enforcement before coming up with recommendations in time for next year’s General Assembly session. Aird said she will “definitely” introduce facial recognition legislation next year, and Hayes said he’s considering it.
Aird linked the issue to police reform and the civil unrest that has erupted since the death of George Floyd, a Black man who died after a white Minneapolis police officer jammed his knee into Floyd’s neck for some eight minutes as Floyd begged him to get off. Before using more powerful tools like facial recognition, law enforcement agencies need to figure out how to use the ones they already have — arresting people, writing citations, using force — without disproportionately affecting people of color, Aird said. If they don’t, officers who start using new technology are just going to fall into grooves well worn by systemic racism, implicit bias and, in some cases, outright bigotry — but with something more powerful and little understood.
“I think that puts more lives at risk. I think it allows for greater discrimination and bias,” she said.
Aird said she didn’t know Norfolk police had been using Clearview until a Pilot reporter told her. Norfolk — with more than 42% of its 244,000 residents being Black and more than half being people of color — is one of the cities where she’s most worried about police using programs like Clearview, given that facial recognition has historically and consistently misidentified non-white people.
Aird said she feels a sense of urgency to provide top-down rules for cities and towns so they can’t fill the vacuum with a “use-now-apologize-later” approach.
“We need these guardrails as soon as possible,” she said. “By the time it becomes a kitchen table conversation, it will be too late to add (them).”
Norfolk’s mayor agrees.
Alexander said he didn’t know the police department had used Clearview until The Pilot told him in early April, and so he was not involved in the decisions to start or stop using it. But, the mayor added, he thinks the police chief made the right call not to use it beyond the three-month trial.
“We found that it was really problematic,” he said. “This is not just a crime fighting tool apparatus. This has the potential of doing much much more.”
Because of that potential, Alexander said it would have been “warranted and appreciated” for police and city officials to present the pros and cons to the City Council before using it. But the mayor stopped short of saying they should have.
For the mayor, many of the problems with Clearview are unanswered questions: How accurate is its program? How has the company secured its database? What happens to the probe photos Norfolk police submitted to Clearview’s system? Is Clearview using those? How so? Are they deleted after a certain period of time?
State lawmakers are the best people to answer those questions, Alexander said, adding that he’s glad delegates Aird and Hayes are seeking answers. The state legislature needs to take the lead by getting input from tech experts, law enforcement officials, privacy rights groups and others. Then they can transform those answers into legislation that will determine if and how law enforcement can use facial recognition.
Alexander, who served 14 years total in the state Senate and House of Delegates, said he could see lawmakers creating a process through which tech companies like Clearview would have to get a license before offering law enforcement agencies in Virginia their facial recognition products. Only those companies willing to validate their technology, prove their ability to secure sensitive information, and open up to regulators would earn such a license.
“We don’t think that this product has gone through that type of scrutiny, that type of discipline and I’m glad we’re not using it,” he said.
In March 2019, a group of tech experts, business people and law enforcement officials wrote a 24-page report about law enforcement using facial recognition. That task force, a joint effort of the International Association of Chiefs of Police and the Integrated Justice Information Systems Institute, recommended police fully inform the public about what they’re doing and create policies spelling out who can do what with the technology and when.
Two decades before that report came out, Virginia Beach police spent three years researching facial recognition and winning support from city leaders. Then police crafted a six-page policy outlining what photos could go in the database, what officers were to do when they got a hit on a felony fugitive versus a missing child, and which officers could access the system: only those authorized by the police chief.
Norfolk police didn’t do any of that.
Boone said the department never wrote any rules for how and when to use facial recognition because detectives were using Clearview on a trial basis. If the department had fully adopted the program, the top brass would have written them, he said.
The chief said he has no plans to use Clearview or any other facial recognition technology, but didn’t rule it out.
“That is so far down the road,” Boone said. “I hadn’t given it much thought.”
___
© 2020 The Virginian-Pilot
Distributed by Tribune Content Agency, LLC.