For most of its history, DARPA was a shadow. Eisenhower signed the agency, then called ARPA, into being in 1958, in an America still trying to forget images of mustard and chlorine gas, one still feeling the effects of ongoing nuclear fallout. Stories of inhuman experiments by the German and Japanese military regimes were still being uncovered. Rumors were rampant about invisible Soviet subs and obscene biological weapons tests. Science and war were respected individually, but mistrusted in concert.
The Cold War was already on, too, and all things military were under the tightest security lockdown. In its earliest phases, ARPA held many of the technological responsibilities of the modern NSA, and struggled with the newly created NASA for control of the country's rocketry program. A de facto wall of silence surrounded even its most domestic projects. Even as the agency began relying more heavily on corporate and academic partnerships, the public and internal consensus remained that ARPA was the agency of murder science.
Perhaps it was the ARPANET that first opened the public's eyes to the idea of DARPA as a national darling. The agency started to play up its less classified achievements while presenting its civilian investments almost as philanthropy. Any non-military technology would be released proudly and with credit; GPS is a particularly stark example, since DARPA played only a partial role in its development.
BigDog seems like the last time people reacted to a DARPA innovation with trepidation.
DARPA was becoming a place where young scientists could actually imagine spending their lives, a slightly macabre but still fun and exciting place to work. And, the impact of this can't be overstated, reality began to slowly catch up to almost a century of science fiction. Somewhere embedded in the collective unconscious is the concept of a robot, a flying car, a killer laser beam, an exoskeleton. The details may have changed, but for the better part of the twentieth century these and other concepts were obsessed over by modern culture while remaining unattainable in the lab. Now the folks at DARPA are literally creating exoskeletons. They're literally making robot helper animals.
Today DARPA is a nerd wonderland agency in a nerd wonderland era. These days, even the borderline projects are safe for public consumption: as the country chews on surveillance as the issue of the future, DARPA releases the latest in invasive surveillance technology to a flurry of Facebook Likes. Its inventions tend to wind up in the family-friendly “Wow, science!” slot of the nightly news, rather than in and amongst the hard, breaking events. Where in the news cycle might we expect to hear that the Chinese military were releasing techno-dubbed video trailers for their nascent Death Robots? The media fondly refers to DARPA as the Pentagon’s “mad scientists,” and who reading this site hasn’t at some point dreamed of being a mad scientist?
That’s really the kernel of the issue, I think. There’s no getting around the fact that the majority of DARPA projects will have at least an air of cool, but that’s no excuse to completely forget their more serious implications. The above video shows the agency’s “Grand Challenge” in robot AI, a rather brilliant move that has everyone from academics to NASA employees blithely programming the movement, awareness, and interaction software of a real-world Terminator. DARPA has cut the teams’ work together into something resembling a montage from BattleBots: it’s ostensibly about grabbing people out of collapsing buildings, but that’s hardly a robust cover story. The effusive enthusiasm of nerdery is trivializing real developments in war, treating actual inventions like ain’t-it-cool science experiments.
The media needs to broadly reevaluate its attitudes toward new military research, and the same certainly goes for the civilian partners who do much of the work. It’s one thing to work as part of an international team to provide a part for, say, the ATLAS experiment. It’s quite another to do the same for the ATLAS robot — but let’s go ahead and post an unboxing video from MIT. Even something as seemingly innocuous as BigDog is meant for deployment on a real battlefield, to influence the course of life-or-death events.
DARPA’s new 1.8 gigapixel drone brings all new abilities to the surveillance state.
Contractors like Boston Dynamics can, ultimately, get away with more than academics. Universities have always had a complex relationship with military research, loving the enthusiasm and the funding while remaining at least somewhat leery of the long-term goals. That’s become especially true in the modern era, when most military advances are about widening the margin of superiority over a fairly distant number two: it’s harder to claim urgent national need when developing Cyber Soldiers to counter regular ones, or drones to blow up mountain-dwelling guerrillas half a world away.
I don’t think it’s necessarily wrong to assist in military research. Even leaving aside the fact that killer science can lead to life-saving or enriching side-inventions, better war tech is often more precise, less lethal, and more discriminating. These people need not (necessarily) hide their faces when admitting to working with the military. They should publish their results and defend their choice of backers, if challenged. But the seemingly thoughtless glee that has come to surround such work, the Nickelodeon enthusiasm for Cool Science, borders on the absurd.
Some derivation of ATLAS’ body and/or mind will someday take to a real battlefield and end a real human life. (Read: Is it time to ban autonomous killer robots before it’s too late?) It will be efficient and deadly in doing so thanks in part to the work of whichever team eventually hoists the DARPA cup under the auspices of inoffensive “rescue” AI. When these creations are recontextualized as the face of modern war, we may look back at this carefree time with something akin to horror.
Or, the more chilling possibility: Maybe we won’t.