I'm going on WBUR's Radio Boston today to discuss the coming drone invasion, and efforts to ensure that police in Massachusetts cannot use drones to spy on people sunbathing in their backyards, or protesting at political rallies. You can learn all about the Drone Privacy Act here.
While doing some research for today's segment, I came across the video embedded above, which tells the story of how a few researchers built a swarm of autonomous drones, using relatively cheap and commercially available materials. The drone swarm flies on its own, without any input from human beings on the ground.
Nature magazine explains:
The aircraft, called quadcopters because they have four rotors, navigate using signals from Global Positioning System (GPS) receivers, communicate their positions to one another via radio and compute their own flight plans. They were created by a team of scientists led by Tamás Vicsek, a physicist at Eötvös Loránd University in Budapest.
“This is remarkable work,” says Iain Couzin, who studies collective animal behaviour at Princeton University in New Jersey. “It is the first outdoor demonstration of how biologically inspired rules can be used to create resilient yet dynamic flocks. [It suggests] we will be able to achieve large, coordinated robot flocks much sooner than many would have anticipated.”
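For readers curious what "biologically inspired rules" actually look like, here is a minimal, purely illustrative sketch in Python of the classic boids-style flocking rules (separation, alignment, cohesion). This is not the Budapest team's actual flight code; the class names, radii, and weights are invented for the example, and a real quadcopter would also have to fuse noisy GPS fixes and radio position updates rather than read its neighbors' exact state.

```python
# Hypothetical boids-style flocking sketch -- illustrative only, not the
# Vicsek team's algorithm. Each simulated drone adjusts its velocity from
# three local rules: separation, alignment, and cohesion.
import random

NEIGHBOR_RADIUS = 10.0   # how far a drone "sees" flockmates (assumed value)
SEPARATION_RADIUS = 2.0  # minimum comfortable spacing (assumed value)

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

class Drone:
    def __init__(self, x, y):
        self.pos = [x, y]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

def step(drones, dt=0.1):
    """Advance the flock one time step using only local information."""
    new_velocities = []
    for d in drones:
        neighbors = [o for o in drones
                     if o is not d and dist(d.pos, o.pos) < NEIGHBOR_RADIUS]
        sep = [0.0, 0.0]; ali = [0.0, 0.0]; coh = [0.0, 0.0]
        if neighbors:
            for o in neighbors:
                # Separation: steer away from drones that are too close.
                if dist(d.pos, o.pos) < SEPARATION_RADIUS:
                    sep[0] += d.pos[0] - o.pos[0]
                    sep[1] += d.pos[1] - o.pos[1]
                # Alignment: match the neighbors' average heading.
                ali[0] += o.vel[0]; ali[1] += o.vel[1]
                # Cohesion: drift toward the neighbors' average position.
                coh[0] += o.pos[0]; coh[1] += o.pos[1]
            n = len(neighbors)
            ali = [ali[0] / n - d.vel[0], ali[1] / n - d.vel[1]]
            coh = [coh[0] / n - d.pos[0], coh[1] / n - d.pos[1]]
        # Weighted sum of the three rules; weights are illustrative, and a
        # real controller would also cap speed and handle GPS/radio noise.
        new_velocities.append([
            d.vel[0] + 1.5 * sep[0] + 0.1 * ali[0] + 0.05 * coh[0],
            d.vel[1] + 1.5 * sep[1] + 0.1 * ali[1] + 0.05 * coh[1],
        ])
    for d, v in zip(drones, new_velocities):
        d.vel = v
        d.pos[0] += v[0] * dt
        d.pos[1] += v[1] * dt

flock = [Drone(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(10)]
for _ in range(100):
    step(flock)
```

The point of rules like these is that no drone needs a central commander or a global picture: each one reacts only to its nearby flockmates, and coherent group behavior emerges anyway.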
In the video, one of the researchers describes how their swarm could be used to deliver food, monitor environmental conditions, and enhance agricultural productivity. He says that drones, like any other technology, can be used for good or evil, and that it's up to us to decide how to put them to work.
While these autonomous swarms may indeed be put to good use, I can also imagine killer robots swarming around, communicating only with each other, using NSA metadata to select targets to kill, and then blowing them up. Already, over 70 countries are working on weaponized autonomous killer robot systems. Apparently they are trying to solve an age-old problem that military leaders have faced since the dawn of time: morality. Drone pilots in the United States have already reported experiencing post-traumatic stress disorder as a result of their work. Handing over drone assassinations to autonomous robots would take care of that problem.
But that's not a good thing. Apart from the obvious ethical problems inherent to making it easier to kill people, allowing autonomous robots to kill independent of human decision-making is illegal under international law. And common sense says it's a terrible idea. As Human Rights Watch's Steve Goose said, "Giving machines the power to decide who lives and dies on the battlefield would take technology too far." Ya think?
Incredible that something like this even needs to be said. Welcome to the 21st century, where corporations and governments can't secure their own data systems or computers, but think building autonomous killer robots is a great idea.