Inside the startup that is building AI-powered military technology
By Blair Morris
February 20, 2020
On a blazing day in the scrublands just outside Irvine, California, Brian Schimpf watched as a man walked into a distant valley wearing a long-sleeved shirt and a hat to protect against the sun.
Within moments, sensors in towers on a nearby hillside used pattern-recognition algorithms to spot the man, and remote cameras found and tracked him. A large helicopter-like drone whirred to life, and flew over to conduct closer surveillance.
Schimpf is the CEO and co-founder of Anduril, a startup that is building surveillance and defense systems for the U.S. military and other agencies. The man being followed by these sensors was an employee, he explained, demonstrating the ability of this system to find and track a human intruder over a wide area with almost no human input.
“The system detects there’s motion, pans a camera over to it and uses computer-vision algorithms to determine, ‘Am I looking at a person, a cow, a car?’” he said of the system, which needed only a single technician to operate.
The technology that governs all of this is a software platform, powered by artificial intelligence, called Lattice. Anduril markets the system as a way of monitoring installations, military bases and borders.
Anduril’s founding mission is to give military and government personnel technology-based capabilities with the help of AI that would allow a single person to keep watch over hundreds of miles of terrain.
At the moment, the intruder-detection system only spots that someone is moving; it doesn't determine whether the person is authorized to be in the area, or whether a weapon is present. But Anduril's other co-founder, Palmer Luckey, said he envisions a future in which the U.S. military can deploy a system like Lattice anywhere for a variety of missions, including battlefield awareness and threat assessment in urban environments.
“What I really want is surveillance that you can deploy on demand to a specific area for a specific need and then pull out,” Luckey said. “I want to be able to say, ‘An operation’s about to happen right here. We need to soak that area with sensors from aerial vehicles, ground vehicles.’”
Anduril was founded in 2017, and has already signed contracts with several branches of the U.S. government. Anduril won’t release a complete list, but a company spokesperson says that it has contracts with roughly a dozen agencies of the Department of Defense and the Department of Homeland Security.
According to a contract obtained by a Freedom of Information Act request filed by the Latinx advocacy group Mijente, the Marine Corps has paid $13.5 million to install Anduril systems at military bases in Japan and the United States, including one that abuts the U.S.-Mexico border. Customs and Border Protection has tested Anduril’s system along a stretch of California’s border with Mexico near San Diego and detected a reported 55 unauthorized migrants attempting to cross, according to Wired magazine. The U.K.’s Royal Marines also have a contract with Anduril, according to the company.
Anduril is also moving beyond surveillance. Schimpf later demonstrated a new capability: detecting and destroying drones using high-powered “interceptor” drones of its own.
At Schimpf’s command, a technician fired up an off-the-shelf white quadcopter and brought it to a hover about 100 feet off the ground. Then Anduril’s interceptor, roughly the weight of a bowling ball, whizzed upward at the white drone, smashed into it and landed undamaged, as the white drone fell to the ground in pieces.
The company recently signed a military contract to deploy these interceptor drones overseas in conflict zones. As drones have become cheaper and easier to buy, they’ve also become a greater threat, used by Islamic State militants, among others, to drop bombs and conduct surveillance. And in December 2018 Gatwick Airport was forced to close and ground hundreds of flights after a drone was sighted near its runways, one of a growing number of such incidents at airports around the world.
The company and its founders are unapologetic about this mission, which makes Anduril an outlier in the U.S. technology industry. The militarization of technology has recently become a sensitive subject at the world's largest tech firms, and employees at several major companies, including Amazon, Microsoft and Google, have privately and publicly protested military uses of the technology they're building.
Anduril is different. Its coders and engineers are openly interested in providing surveillance systems to the U.S. military. In an interview at Anduril’s new headquarters in Irvine, Luckey, a former Facebook executive, detailed why he founded the company, and why he thinks much of Silicon Valley is wrong not to help the U.S. government.
“The United States needs to be focusing on the technologies that are going to win the next wars, not the ones that won the last wars,” Luckey said. “And the technology companies that should be solving these problems refuse to do so.”
The new arms race
Luckey, 27, is among the more polarizing figures of the tech industry. After starting out building high-end gaming systems, Luckey went on to launch a virtual reality company called Oculus, which was acquired by Facebook in March of 2014 for more than $2 billion.
But Luckey was ousted from Facebook in 2017 after the company lost a $500 million intellectual property lawsuit on Oculus’s behalf. Luckey’s politics also became part of the story when The Wall Street Journal reported the departure may have had to do with Luckey donating to an anti-Hillary Clinton group in the run-up to the 2016 election.
“It wasn’t my choice to leave,” he told CNBC’s Andrew Ross Sorkin in October 2018 at an event in Los Angeles. “I gave $10,000 to a pro-Trump group, and I think that’s something to do with it,” he also told CNBC’s Deirdre Bosa.
But Luckey kept a key Facebook figure in his corner: Peter Thiel, a Facebook board member and an early investor in Oculus. Thiel is also one of the few outspoken supporters of President Donald Trump in the tech industry.
Luckey announced plans for Anduril, named for a sword called “the flame of the West” in J.R.R. Tolkien’s “The Lord of the Rings,” shortly after his departure from Facebook. Founders Fund, a venture fund led in part by Thiel, was among its earliest investors. That same fund helped launch Palantir, another surveillance-technology company that has contracts with the military and the U.S. government. Several of Anduril’s executives, including Schimpf, came to Anduril from Palantir.
Thiel has argued that tech companies have some patriotic duty to work with the U.S. government, and not with its rivals. In a New York Times op-ed article, Thiel called AI “a military technology” and criticized Google for “starting an A.I. lab in China while ending an A.I. contract with the Pentagon.”
Luckey appears to share a similar worldview, stressing that China’s AI development — and its willingness to sell its technology to countries around the world — is a new arms race that the U.S. is losing.
“In the same way that the Soviets gave away boxes of AK-47s to other countries to get in bed with them, China is giving countries in Africa and Asia access to artificial intelligence technology that allows them to build totalitarian police states,” Luckey said. “And they do this because it makes these countries completely dependent on China.”
Some technologists disagree.
“I think it is concerning that we are seeing an ever closer relationship between the world’s most powerful military and the tech industry,” said Meredith Whittaker, a former Google employee who worked on AI.
Whittaker also helped organize the internal resistance to Google’s work on Project Maven, a Pentagon contract for AI systems that Google decided not to renew after a roughly 4,000-employee petition circulated inside the company. She now co-directs the AI Now Institute at New York University, which studies the social implications of artificial intelligence. She said that some Google co-workers didn’t want AI to be used in matters of imprisonment and freedom or life and death, because the technology isn’t reliable enough and there’s almost no public oversight.
“These are the people who are extremely close to the specifics of these technologies, who know full well how brittle these systems can be, how inaccurate they can be,” Whittaker said.
She argued that it was the lack of public information about how technology is used within government agencies that forced her and her colleagues to take a stand.
“There is so little accountability around these relationships,” Whittaker said. “On the tech company side you have practices and details that are hidden behind trade secrecy. And on the military side, you have practices and details that are hidden behind classification protocols. In the middle there is very, very little room for democratic deliberation.”
Because Anduril was purpose-built for government and military work, there is no internal debate over the ethics of militarizing its technology.
The company writes code and fabricates sensors and drones in a 155,000-square-foot building traversed by engineers on skateboards and hoverboards. The company recently closed a new round of investment led by Founders Fund and General Catalyst, with Andreessen Horowitz, 8VC and Lux Capital also involved. Anduril says it is now valued at nearly $1 billion.
The idea that U.S. technology companies bear some responsibility to sell to the government has gained traction among tech executives. Microsoft’s chief executive, Satya Nadella, told CNN Business that he and his company had “made a principled decision that we’re not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy.” Amazon CEO Jeff Bezos told the Wired 25 Conference in October 2018 that “if big tech companies are going to turn their back on U.S. Department of Defense, this country is going to be in trouble.”
Asked whether there was any unethical use to which his technology might be put that would cause him to pull it off the market, Luckey said he couldn’t think of an example. “There are things that I don’t want it to be used for,” he said, but he went on to say he trusts military and government agencies to obey their own ethical standards.
“I’m not that concerned because I think the United States does have a really good reflex on these types of things,” he said.
Luckey pointed out that Anduril's border system doesn't use facial recognition or other biometrics to identify specific individuals or store their identities. He acknowledged, however, that nothing intrinsic to the system would keep a military or government client from feeding the images it captures through its own facial recognition software.
“There’s nothing you could do to stop people from combining things,” he said.
But Luckey said that he doesn’t feel that his company should be in the business of denying its work to any particular government agency based on political or ethical beliefs.
“We should be voting in people who are going to do the right things,” Luckey said. “But what you don’t want to do is say, ‘I’m afraid my technology might be used some day for something that might be unethical. And therefore I’m going to deprive the armed forces of the technology wholesale so they can’t use it in any case.’”
And Luckey also warned that failing to develop responsible technologies leaves the door open for governments around the world to fall back on technology from other countries that may not be as ethically sound.
In the end, he said he doesn’t believe it’s his job to create new ethical standards to go along with new technology.
“I don’t think I’m the guy to teach people ethics. I can give people my perspective,” he said. “But I think fundamentally, this comes back to me being an optimist about the American system.”