Elon Musk, Google’s DeepMind co-founders promise never to make killer robots
Musk has been particularly vocal about the potential dangers of artificial intelligence. For example: “Mark my words, AI is far more dangerous than nukes,” Musk said at the South by Southwest tech conference in Austin, Texas, in March.

DeepMind is an artificial intelligence company founded in London in 2010 and acquired by Google in 2014. All three of the company’s co-founders — Demis Hassabis, Shane Legg, and Mustafa Suleyman — are signatories of the pledge.

Google employees recently petitioned the company’s management to extricate itself from a contract with the United States Department of Defense called Project Maven. Under the partnership, Google developed artificial intelligence surveillance tools to help the military analyze video footage captured by U.S. government drones. “We believe that Google should not be in the business of war,” Google employees wrote in a letter to their boss, CEO Sundar Pichai. In June, Google Cloud chief Diane Greene told employees the company would not renew its contract with the Department of Defense after it expires in March 2019.

In Wednesday’s pledge, published by the Future of Life Institute, “lethal autonomous weapons” are defined as those that can “identify, target, and kill a person, without a human ‘in-the-loop,’” according to the nonprofit’s written statement. “That is, no person makes the final decision to authorize lethal force: the decision and authorization about whether or not someone will die is left to the autonomous weapons system,” the statement says.

Meanwhile, “today’s drones” are not covered by the Future of Life Institute’s definition of lethal autonomous weapons because drones “are under human control,” the statement says. Autonomous machines that defend against other weapons are likewise excluded, according to the statement.

“AI has huge potential to help the world — if we stigmatize and prevent its abuse. AI weapons that autonomously decide to kill people are as disgusting and destabilizing as bioweapons, and should be dealt with in the same way,” Max Tegmark, president of the Future of Life Institute and MIT physics professor, said in a statement announcing the pledge.
