Rick Smith, whose inventions changed the way millions of people understand modern policing, now wants to send them to war.
Smith invented the Taser, the stun gun that is often the first thing police officers reach for when things get tense. As public concern mounted that cops were maybe a bit too eager to tase people, Smith invented the police-worn body camera, which has become a staple of U.S. police departments and plays a starring role in our national conversation about police reform.
So what’s next? Smith says AI and robotics will dramatically change how police departments do what they do. They could also reshape the American way of war.
Smith’s company, Axon, is already using machine learning on body camera footage. The company has access to huge amounts of body-camera video because police departments pay Axon to host it on Microsoft Azure. “Basically every big department you can think of, NYPD, LA, Chicago, D.C., we host all their data in the cloud for them,” Smith said during the recent AUSA conference in Washington, D.C.
Axon’s specific AI uses right now are limited, said Smith. The biggest area is instantaneous translation of spoken words into text. In Australia, for instance, he says that police routinely use the instant-transcribe feature while investigating domestic abuse. Previously, police might take down a statement and then tell the victim to come to the police department the next day for a formal interview. That can give the abuser enough time to intimidate the victim into not telling her or his story. With the instant transcription feature, Smith said, “They do [the interview] right there on the scene. They just interview them on camera. We transcribe it and then we’ve got a transcribed version of the video available nearly instantaneously.”
Taking a cue from the military, which has deployed AI to scan video footage and tip a human analyst when something interesting shows up, Axon has a program that will also analyze footage to help police departments understand what happens when an officer pulls a weapon. That’s making departments’ regular audits of officer conduct more efficient, said Smith.
“If you’re a sergeant, and you have 10 officers that report to you, up until last year, they would do random video [analysis] on it. It’s like the random TSA audit. You pick some videos; you check them out and you see what’s happening. Well, it’s pretty inefficient because in most police videos nothing interesting happens. It’s just standard interaction. So now what we’re doing is we’re doing intelligent video selection,” he said. “Anytime the gun comes out of the holster or the Taser gets armed, those signals are processed on the camera. We can mark those videos: ‘You might wanna have a supervisor look at this one because the gun came out of the holster.’ That means it’s probably an elevated situation. Or we can do it based on transcript analysis. So basically anytime there’s swear words or other indications of some form of brewing conflict, we can mark those videos for review. It’s massively…helping them be more efficient in finding and reviewing the incidents that matter.”
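The selection logic Smith describes — flag a video for supervisor review when a hardware signal fires or the transcript suggests conflict — could be sketched roughly as follows. This is a hypothetical illustration; the function, signal names, and keyword list are invented for clarity and are not Axon's actual implementation.

```python
# Hypothetical sketch of "intelligent video selection": mark footage for
# supervisor review when a hardware event fires (gun drawn, Taser armed)
# or the transcript contains language suggesting a brewing conflict.
# All names here are illustrative, not Axon's.

CONFLICT_PHRASES = {"stop resisting", "get down", "drop it"}  # example terms

def flag_for_review(signals, transcript):
    """Return the reasons, if any, that a video warrants supervisor review."""
    reasons = []
    if "gun_unholstered" in signals:
        reasons.append("firearm drawn")
    if "taser_armed" in signals:
        reasons.append("Taser armed")
    text = transcript.lower()
    if any(phrase in text for phrase in CONFLICT_PHRASES):
        reasons.append("conflict language in transcript")
    return reasons

# A routine interaction produces no flags; a gun-draw event does.
print(flag_for_review([], "license and registration, please"))
print(flag_for_review(["gun_unholstered"], "Drop it now!"))
```

In this sketch, a video with an empty reasons list is skipped, which captures Smith's point: most footage shows nothing notable, so review effort concentrates on the small fraction of flagged incidents.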
But Smith isn’t in a rush to deploy every AI tool on the Axon data set. For instance, the company has a public stance against using facial recognition for police body cameras in the United States, a decision Smith reached after establishing an AI ethics board roughly three years ago.
“We have law enforcement leaders and civil liberties folks on this board. And during the interactions, they were really pushing us to say, ‘Okay, what is the actual use case [for facial recognition]? What do you use this for? Is the technology reliable enough yet? Is it—candidly—legal to be scanning the faces in tracking the whereabouts of every American citizen?'”
It’s one of those technology areas that “sounded sexy at first,” Smith said. But the more he dug into the use cases, the thorny legal issues, and other considerations, the less appealing it became.
Smith now has his sights set on the U.S. military market, which is about 10 percent of his business and growing rapidly. The company just inked a deal with the U.S. National Guard. Now Smith is looking to equip deployed infantry or even special operations teams.
Those groups are governed by different sets of rules and regulations; for instance, special operations teams collect biometric data like face scans that U.S. police departments do not. Smith pointed out that the Army, in particular, is looking to collect more data from the battlefield, particularly from soldiers, to help commanders better understand rapidly changing dynamics and accelerate operations.
But he said the U.S. military could be doing more with non-lethal technologies like Tasers.
Historically, non-lethal arms haven’t much appealed to militaries. Then-Defense Secretary James Mattis used “lethality” as a marker for success; improving soldier “lethality” was one of his top goals. The Army even has an entire cross-functional team devoted to soldier “lethality.”
Smith points out that what may have worked to intimidate enemy militaries also alienates the civilians that the military is interacting with more and more, especially in urban environments. If non-lethal force can achieve the same suppressive effect, he says, it could make it a lot easier to operate in more environments. “We need the idea to gain traction, so that, you know, we can start to get some research and development investment.”
Future breakthroughs in robotics will hopefully make Tasers and other non-lethal technologies easier to deploy in combat settings, he said. Deployment on drones, for instance, would give the Taser far greater range, beyond the current 25 feet or so.
But before Axon sends Taser drones out on U.S. streets, Smith says that the company needs to be very clear about the “laws of robotics” that will guide the way. The company is in the process of crafting those. Once those are in place, he feels that Taser drones are maybe three years away.
Smith says that the first “law” will set the tone for everything the company does in that area: an authenticated human has to be in control of the decision to fire.
“One of the concerns people have is if we gamify this, then police or military are just going to be excessively using it. Well, I think that’s a legitimate concern, but it’s solvable. And the way you do that is through greater scrutiny and oversight.”
He points out that drone operators already pilot both small and large drones with the help of streaming video, so it would be simple to record incidents of drone-deployed Tasers for oversight or make it so that a commanding officer has to sign off on the use.
A future where drones tase people sounds a bit dystopian. But Smith says that sending drones into a crowded house is a lot safer than sending in guys with guns. “Look, there’s going to be public discomfort because it’s new and you know, people love to, like, scream and cry about things that are new. I just like to bring it back to the now: The now is you’re talking about a bunch of guys surging on adrenaline running through the door with lethal weapons or finger on the trigger.”
The ability to deploy non-lethal force will also matter more in future soldiering, especially as the U.S. military looks to strike targets in places where there are fewer troops on the ground to provide intelligence or guidance—so-called over-the-horizon operations.
“Look at that guy with the vehicle in Kabul that we just killed,” said Smith, referring to the September incident where the U.S. military unknowingly killed an Afghan aid worker during a chaotic evacuation. “You know, it is perfectly reasonable that, with a little bit of effort, we can develop systems that could take that guy down and even robotically apprehend him.”
© 2021 Government Executive Media Group LLC
Distributed by Tribune Content Agency, LLC.