Here’s how Google pitched AI tools to special operators last month

Exploitation Analyst airmen assigned to the 41st Intelligence Squadron have begun using advanced mobile desktop training that challenges each individual analyst to achieve mission objectives through cyberspace maneuvers, at Fort George G. Meade, Md. (U.S. Air Force/Staff Sgt. Alexandre Montes)

Not long before Google announced it would end one part of its AI work for the U.S. military, the company’s Cloud team was pitching artificial intelligence and machine learning tools to members of the U.S. special operations community, the soldiers very much at the front lines of combat in places like Syria, Iraq, Afghanistan, and Africa.

The pitches were part of Google’s sales work at the SOFIC convention, which draws special operators from around the world to Tampa, Florida, each May. A document distributed at a May 24 sales presentation touts Google Cloud and associated AI capabilities as useful for some of special operations forces’ most important work. Defense One obtained a copy of the document.

One sheet describes how the tools can help with sensitive site exploitation, or SSE.

“As part of the Special Operations mission to turn captured enemy material into actionable intelligence, Commands are tasked with collection, exploitation, and dissemination of unclassified material to include documents, images, audio, and video,” it says. The document goes on to describe how Google Cloud’s machine learning APIs, the company’s scalable computing and storage resources, and other capabilities can help “accelerate exploitation of valuable unclassified intelligence material.”

Another sheet says the tools make it easy to exploit open source information, or OSINT, while using Google’s global network infrastructure to “blend in with local traffic from your desired Point of Presence.” The document suggests that the special operators would get more than an ordinary customer’s access to software libraries and tools: “Work with Google to tailor the proposed workflow to fit your mission, with full visibility on how the elements work together to streamline analysis.”

Google began to reach out to the special operations community last summer, according to a person with direct knowledge of the effort.

Later, around the beginning of 2018, Google teamed up with a hardware provider named Klas Telecom, a maker of hardened networking modules for use in rugged and austere terrain, a representative from the Irish company said.

Google sales reps did a number of in-house demonstrations and sales pitches, primarily to U.S. military customers in Europe, a training and departure point for many U.S. forces headed to Africa and the Middle East. The representative said that Klas hardware running Google’s cloud had not yet been deployed into combat areas.

All this shows how fiercely Google had been pursuing AI-related contracts for the Pentagon, even as an internal revolt was brewing against them. By mid-May, a dozen employees had quit over the company’s key role in Project Maven, which harnesses AI to speed up the processing of video imagery. Google has tried to cast its work on Maven as benign and “not for offensive purposes.” But its work has already helped the Air Force target terrorists in the Middle East. Moreover, Maven is intended to be a “pathfinder” project, charting a way for Google and the Pentagon to vastly increase their collaboration, whether through a massive Pentagon cloud contract now being competed, or through smaller efforts. The future of that outreach is somewhat in question.

SSE and OSINT, by themselves, aren’t exotic or controversial activities. There’s a big difference between analyzing material on a seized laptop and planning or executing a drone strike. The company has insisted, repeatedly and adamantly, that the artificial intelligence and machine learning capabilities it has provided to the U.S. military through Project Maven play no direct role in designating targets, much less firing weapons at them. Those decisions are very much still made by humans.

But intelligence gathered and seized from laptops can help designate targets for strikes, as can intelligence gleaned from open sources.

The company’s efforts to draw a distinction between targeting and developing tools to help targeters have failed to satisfy internal critics who say the company should not work with the Defense Department at all.

Meanwhile, many in the national security community have condemned Google’s decision to not renew its Maven contract. “When companies or individuals are in a position to make a significant and positive contribution to national security, they should take it,” said Gregory Allen, an adjunct fellow at the Center for a New American Security who penned an essay on the topic for Nature. “Google deserves credit, not condemnation for their initial willingness to contribute. It is unfortunate that they have withdrawn in the face of unjustified pressure, much of which is based on a faulty understanding of what Maven does.”

The military reminds vendors repeatedly that its intelligence analysts are incredibly overworked, that the volume and variety of data that they need to sift through is too high, and that the importance of machine learning and artificial intelligence is growing in rough correlation to the amount of potentially usable data. Bottom line, the military needs emerging technologies to better do what it does. Sometimes that means hitting human targets with missiles.

On Thursday, Google CEO Sundar Pichai published the company’s much-anticipated list of principles for how it would develop new artificial intelligence capabilities. In a blog post, Pichai writes that the company will not make “weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people,” nor will it make “technologies that gather or use information for surveillance violating internationally accepted norms.”

Sensitive site exploitation and open source intelligence collection are not weapons, nor are they outside legal norms. In theory, then, the publication of the principles should not affect the company’s outreach to SOF operators. But Pichai goes on to outline what specific areas the company will take military money to work on. They include “cybersecurity, training, military recruitment, veterans’ healthcare, and search and rescue” — and not SSE or OSINT.

In her own blog post, Google Cloud chief Diane Greene makes two very different points. First, the company is not bailing on the Maven contract, but is merely not renewing it. Second, it is still open to doing work for the military: “We will continue to work with government organizations on cybersecurity, productivity tools, healthcare, and other forms of cloud initiatives.”

The addition of “productivity tools” is key. Would such tools include machine learning to help special operators in combat sort through data on seized laptops and scour open sources to develop intelligence on specific people — people who may be targeted for drone strikes?

Defense One asked Google spokespeople about this; they pointed to Greene’s blog post and declined to respond further.