AI Researcher Warns Skynet Killer-Robots "Easier To Achieve Than Self-Driving Cars"

A group of the world’s leading AI researchers and humanitarian organizations is warning about lethal autonomous weapons systems, or killer robots, which select and kill targets without human control.

The group alleges that killer robots now exist and that the bulk of these technological developments are military funded in the UK, China, Israel, Russia, and the United States. Although fully autonomous weapons systems have not yet been deployed on the battlefield, the group is pushing for a preemptive ban before the technology falls into the wrong hands. It calls on citizens of the world to contact their representatives and on countries to work together to form international treaties before it’s too late.

Member List of the Campaign to Stop Killer Robots:

  • Human Rights Watch
  • Article 36
  • International Committee for Robot Arms Control
  • PAX
  • Association for Aid and Relief, Japan
  • Mines Action Canada
  • Nobel Women’s Initiative
  • Pugwash Conferences on Science & World Affairs
  • Seguridad Humana en América Latina y el Caribe (SEHLAC)
  • Women’s International League for Peace and Freedom

The Future of Life Institute released a video yesterday titled Slaughterbots.

The video portrays a not-too-distant future in which a military firm unveils a drone carrying a shaped explosive charge that can target and kill humans on its own. Further in, the video abruptly changes pace when bad guys get ahold of the technology and unleash swarms of killer robots onto the streets of Washington, D.C. and into various academic institutions.

The video is aggressive and graphic, but it makes clear that if the technology were misused it could have severe consequences, such as civilian mass-casualty events.

Stuart Russell, a world-leading AI researcher at the University of California, Berkeley, showed the video to the United Nations Convention on Certain Conventional Weapons on Monday. He said, “the technology illustrated in the film is simply an integration of existing capabilities. It is not science fiction. In fact, it is easier to achieve than self-driving cars, which require far higher standards of performance.”

Russell wants a preemptive ban on the technology before it’s too late. He claims the window to halt such technologies is closing, and he warns that autonomous weapons, such as drones, tanks, and automated machine guns, are imminent.

The Guardian adds,

The military has been one of the largest funders and adopters of artificial intelligence technology.

The computing techniques help robots fly, navigate terrain, and patrol territories under the seas. Hooked up to a camera feed, image recognition algorithms can scan video footage for targets better than a human can.

An automated sentry that guards South Korea’s border with the North draws on the technology to spot and track targets up to 4km away.

The International Committee for Robot Arms Control is calling upon the international community for a treaty against autonomous weapon systems.

Given the rapid pace of development of military robotics and the pressing dangers that these pose to peace and international security and to civilians in conflict, we call upon the international community for a legally binding treaty to prohibit the development, testing, production and use of autonomous weapon systems in all circumstances.  

Human Rights Watch is another organization calling for preventive measures to stop the machines…

The development of fully autonomous weapons—“killer robots”—that could select and engage targets without human intervention needs to be stopped to prevent a future of warfare and policing outside of human control and responsibility.

Human Rights Watch investigates these and other problematic weapons systems and works to develop and monitor international standards to protect civilians from armed violence.

Conclusion: Targeted killing practices have certainly evolved into something along the lines of The Terminator, a science fiction film in which Skynet, an artificial intelligence system, gains self-awareness and decides to wage war on humans. Will Stuart Russell and his fellow AI researchers be able to stop the trend in autonomous weapon systems before it’s too late?

Comments

DEMIZEN Juggernaut x2 Thu, 11/16/2017 - 08:53 Permalink

you could do lug-wrench-shaped RPG barrels with 4 fans. two hits. maybe even go hexa for 3 hits. these little 1k-dollar wasp drones could saturate very expensive offensive weapons and confuse the fuck out of flares. war was never a paradise, but at least this one looks like it will be over fast for most of us. i'd rather be killed by a robot than by some really sick sadistic fuck.


DEMIZEN Wed, 11/15/2017 - 22:32 Permalink

the scary part is not the fact that they can. with a given power supply to the arms, the system could probably pick targets so fast that muzzle velocity and barrel heat dissipation would be the limiting factors. day or night. probably even faster at night, due to the temp difference. fuck, it could even go bayesian and pick target order to minimize servo movements.

DEMIZEN RationalLuddite Thu, 11/16/2017 - 08:43 Permalink

i mean it. air/sea/nukes ceteris paribus, it becomes robo trench warfare; the offense dept's main problem: electric power, mobility, and logistics for armor and mobile platforms. project tactical (electric) power => new army. when you run your cost/benefit analysis on spoils and warfare costs, you'll decide to stay at home. nukes are probably the best thing that happened to modern peace. modern autoguns will double down on it.


HRClinton Wed, 11/15/2017 - 22:32 Permalink

Re... AI Researcher Warns Skynet Killer-Robots "Easier To Achieve That Self-Driving Cars"

Looks like TD is getting as bad as me with spell-checking the damn auto-spell tool. I'm pretty sure it's...

AI Researcher Warns Skynet Killer-Robots "Easier To Achieve Than Self-Driving Cars"

Neochrome Wed, 11/15/2017 - 22:36 Permalink

Or... a simple phone app, monitoring for movement and firing a sniper rifle from a hidden location. You don't need an AI to spread terror; if anything, AI can help shoot down those drones with a laser-enforced no-drone zone.

Vlad the Inhaler Wed, 11/15/2017 - 22:47 Permalink

"unleash swarms of killer robots onto the streets of Washington, D.C. and various academic institutions" is actually not a bad idea.  Maybe we should let them work on these just a little bit longer.

whatswhat1@yahoo.com Wed, 11/15/2017 - 22:54 Permalink

Low-tech, indiscriminate slaughter (Las Vegas) is easy, cost-effective, and available to any psychopath with a decent weapon.