Attack of the Slaughterbots

In the United States, where mass shootings happen almost daily, the gun control conversation is buzzing. Do guns kill people? Or do people kill people? Add a President who tweets “fire and fury” insults at North Korea’s leader, and the weapons discussion goes nuclear.

Obviously, the issue goes far beyond America. Relatively unfettered access to weapons of destruction—mass or otherwise—leaves many in constant fear and danger. Al-Qaeda, ISIS, the Taliban, Boko Haram, Hezbollah, and other groups have turned terror into a way of life.

Even the “good guys” are sometimes guilty, intentionally or not. Civilian casualties are a regular part of warfare. Innocents are killed in drive-by shootings. And then there’s the “collateral damage” of drone strikes that are rarely as precise as planned.

Speaking of drones . . .

What if we took the “human element” out of all of the above, and simply let smart drones—specifically, “killer bots,” operating on artificial intelligence—do the job?

Would that cut down on the mass shootings? Probably. Eliminate “collateral damage”? Almost certainly. Wipe out terror groups? Possibly. Ensure that only “bad guys” get offed? Well . . .

How do you define “bad guy”? Who does the defining? And who programs the AI that ultimately tells the lethal drones whom to take out?

Take it a step further: what if a single one of those smart drones could fit in the palm of your hand and was programmed to seek out and destroy one individual, almost like a bullet to the head?

Might be good for taking out the likes of Osama bin Laden. But what about someone who shamed you on social media? Or the school bully? Or the creep who sexually assaulted you?

Could such smart drones become the new version of “packing heat”?

The questions aren’t merely theoretical. The technology is already here. Stuart Russell, a professor of computer science at UC Berkeley, has put together a chilling video called “Slaughterbots” that has rapidly gone viral.

Watch it, and consider the ethical questions that come to mind. (We found the moments of audience applause especially creepy.)

Assuming you’ve now seen the video, it’s worth adding that Russell presented it at a recent meeting of the United Nations Convention on Certain Conventional Weapons, at an event hosted by the Campaign to Stop Killer Robots.

There’s widespread support for a ban on this technology. More than 200 Canadian scientists and over 100 Australian scientists wrote open letters to their respective prime ministers, Justin Trudeau and Malcolm Turnbull, urging them to support the ban. Those letters follow earlier calls for a ban, including a 2015 open letter signed by more than 20,000 AI and robotics researchers and others, Elon Musk and Stephen Hawking among them.

Read more about it at the Future of Life Institute. And there’s much more here.


Photo and video: Future of Life Institute

 

The editorial staff of ORBITER magazine humbly pursues life's Big Questions, illuminating the human condition and our place in the universe.