AI Researchers Boycott South Korean University Over Plan To Build "Killer Robots"

It looks like Tesla CEO Elon Musk and Russian President Vladimir Putin aren't the only ones who've envisioned a nightmare scenario where "killer robots" stalk through neighborhoods murdering innocent Americans (or Russians).

A group of artificial intelligence researchers from nearly 30 countries is boycotting one of South Korea's most prestigious universities over its recent partnership with an "ethically dubious" arms manufacturer, a project whose stated purpose is to design and manufacture "autonomous weapons systems".

The Korea Advanced Institute of Science and Technology (KAIST) and its partner, the weapons manufacturer Hanwha Systems, one of South Korea's largest arms dealers, are pushing back against the boycott, saying they have no intention of developing "killer robots" - even though the description of the project clearly states its goals, per the Guardian.

"There are plenty of great things you can do with AI that save lives, including in a military context, but to openly declare the goal is to develop autonomous weapons and have a partner like this sparks huge concern," said Toby Walsh, the organiser of the boycott and a professor at the University of New South Wales.

"This is a very respected university partnering with a very ethically dubious partner that continues to violate international norms."

What's worse, the scientists say, is Hanwha's history of manufacturing and selling cluster munitions and other arms that are banned in more than 120 countries under an international treaty that South Korea, the US, Russia and China have not signed.

Killer

Walsh, an Australian professor, became aware of the project after reading a Korea Times article that described KAIST as "joining the global competition to develop autonomous arms". He said he promptly wrote to the university asking for more information but never received a response.

Participants in the boycott have promised not to visit KAIST or host or collaborate with any of its faculty "over fears it could accelerate the arms race to develop autonomous weapons."

KAIST opened the controversial research center on Feb. 20. At the time, university leaders said it would "provide a strong foundation for developing national defense technology."

The announcement of the initiative, which has since been deleted, said it would focus on "AI-based command and decision systems, composite navigation algorithms for mega-scale unmanned undersea vehicles, AI-based smart aircraft training systems, and AI-based smart object tracking and recognition technology."

However, for all their effort, it appears the boycotters are already too late to prevent the creation of killer robots, though the group is still agitating for governments to promise to ban the manufacture, use and distribution of these weapons.

South Korea’s Dodaam Systems already manufactures a fully autonomous "combat robot", a stationary turret, capable of detecting targets up to 3km away. Customers include the United Arab Emirates and Qatar and it has been tested on the highly militarised border with North Korea, but company executives told the BBC in 2015 there were "self-imposed restrictions" that required a human to deliver a lethal attack.

The Taranis military drone built by the UK’s BAE Systems can technically operate entirely autonomously, according to Walsh, who said killer robots made everyone less safe, even in a dangerous neighbourhood.

"Developing autonomous weapons would make the security situation on the Korean peninsula worse, not better," he said.

"If these weapons get made anywhere, eventually they would certainly turn up in North Korea and they would have no qualms about using them against the South."

The idea that governments should do more to prevent, or at least regulate, increasingly advanced smart weapons is gaining traction around the world. Last year, Elon Musk surprised his Twitter followers by conjuring up an image of robots walking down streets murdering people, while Putin once jokingly mused, "how long until the robots eat us?"

Comments

NiggaPleeze ebworthen Fri, 04/06/2018 - 23:53 Permalink

Facial and biometric recognition, with data tied together through social media and internet access, plus artificial intelligence able to hunt down your phone (i.e., you).

The drones don't really exist yet - it's a "warning" video - but that technology is a few years off at most. All of it exists already; it just needs more miniaturization and cost reduction, and that follows Moore's law: with electronics, roughly half as big and half as expensive every two years.

https://www.youtube.com/watch?v=TlO2gcs1YvM
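The halving arithmetic behind that comment can be sketched in a few lines. Note this is only an illustration of the commenter's loose Moore's-law framing (halving size and cost every two years), not a measured trend, and the starting size and cost figures are hypothetical:

```python
# Sketch of the comment's scaling claim: electronics roughly halve in
# size and cost every two years (a loose Moore's-law framing).
# The starting figures below are hypothetical, for illustration only.

def project(value: float, years: float, halving_period: float = 2.0) -> float:
    """Halve `value` once per `halving_period` years."""
    return value * 0.5 ** (years / halving_period)

size_cm = 30.0     # hypothetical drone length today
cost_usd = 2000.0  # hypothetical unit cost today

for years in (2, 4, 6):
    print(f"{years}y: {project(size_cm, years):.1f} cm, "
          f"${project(cost_usd, years):.2f}")
```

Under those made-up starting figures, six years of halving every two years leaves an eighth of the original size and cost, which is the commenter's "a few years off at most" claim in numbers.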

Once the oligarchs have it, kiss your last bit of freedom goodbye.  They'll take your guns, cash, dignity, whatever they want, they'll be unstoppable.

That's why the fact that we have Bolshevik mass-murdering war criminals in the seat of power in DC actually matters a hell of a lot.

In reply to by ebworthen

BigJim boattrash Sat, 04/07/2018 - 08:46 Permalink

If you destroy a killer drone - which will be a little 5-inch-long octocopter with an explosive charge - the hive mind (which exists across the other 500 of them patrolling your neighbourhood 24/7) perceives you as a threat and dispatches two to take you out. You manage to destroy those, and they send four. Destroy those, they send eight; rinse, repeat, until you're just a collection of bloody shards.

After a while, just raise your weapon in the direction of the swarm and you'll be perceived as a threat. After a year of this, just having a weapon is a sign you're resistant to their authority (ie, a threat). After a while, just scowling when one of them flies overhead will be enough to get you marked.

Better have your best "yes, master!" shit-eating grin ready at all times, citizen. Inside or out, cuz you better believe every room in your house will have cameras covering all the angles.

I give it 15 years.

In reply to by boattrash

NiggaPleeze Slipstream Sat, 04/07/2018 - 10:13 Permalink

Because they have limited battery life, the killer drones would spend most of the time charging (likely solar), sitting in a tree, waiting to pounce.

Surveillance would be done by FB, Google, street/building camera surveillance, your own "smartphone", plane drones in the skies, balloon drones higher in the sky, satellites, etc.  Visual and audio plus infrared.  Full spectral surveillance.

China already has over 200 million cameras deployed, and within two years it is expected to have 1 billion, all doing facial recognition and AI behavior detection. Chinese authorities are already issuing tickets to citizens for rule violations.

Nope, the future is incredibly bleak.  That's what cheering on the PATRIOT ACT, mass NSA surveillance, etc. in the name of "security" gets you:  utter insecurity.  Because the greatest threat always has been, and always will be, the most concentrated power, and that lies in the government, the seat of organized violence and control.

In reply to by Slipstream

AGuy NiggaPleeze Sat, 04/07/2018 - 14:52 Permalink

"Because they have limited battery life the killer drones would be charging most of the time (likely solar), sitting on a tree, waiting to pounce."

I believe the SK military bots are not flying drones but ground-based systems that secure fixed locations or an area. SK already deployed autonomous weapons about a decade ago.

Realistically, I don't expect quadcopters to serve in combat roles (except perhaps short-distance surveillance); they are too limited. I would expect ground-based bots like the Boston Dynamics designs, or rocket-propelled weapons with semi-autonomous target guidance, to be used instead. Aerobots with medium to long range would be used against naval targets and fixed locations whose defense systems a swarm of aerobots could overwhelm.

In reply to by NiggaPleeze

I am Groot Fri, 04/06/2018 - 22:46 Permalink

I'd rather have a "Fembot" like in Austin Powers. Machine-gun jubblies, and she'd cater to all my sexual needs. If it can vacuum and clean like Rosie, I'll take two, please.

Let it Go I am Groot Sat, 04/07/2018 - 09:38 Permalink

It is not all that difficult to imagine a world where a sex-bot could make our human spouses obsolete, someone had to say it. This is, however, a door that swings both ways. Women should not get too smug about their place in society, and neither should men.

Many women don't think men are all that they are cracked up to be or show all the qualities they might desire in a mate. More than one woman has said the words, "I don't know why I put up with him" and often more than once.  The provocative article below makes the case for replacing your spouse with a better model.

"Sex-bots Could Make Your Human Spouse Obsolete"

In reply to by I am Groot

Laughing.Man Sat, 04/07/2018 - 00:35 Permalink

A day late, a dollar short.  South Korea isn't the only nation developing AI killer robots.  Instead of boycotting, just abandon and destroy all the research data.  But we all know it's already too late to do anything about it.  I'm expecting a big jump in cybernetics in the future, like the Borg in Star Trek.  If that becomes a reality, then we're royally fucked.

dchang0 Sat, 04/07/2018 - 03:00 Permalink

It's funny how these scientists think they can stop this with a boycott.

Bad actors, some of them non-state-actors, are going to do whatever they want, boycott or not.

Mr Perspective Sat, 04/07/2018 - 12:13 Permalink

Why does the lying terrorist media take such glee in this ad nauseam repetition of killer robot narratives?

Think about it. What is the purpose of this article?  Is it supposed to encourage you to take action to stop the development of scary monsters? Oh sure, you can do that....

Face it. Humans, by definition, are AI. What could possibly be worse than the human?  No hu-man made weapon in the world is dangerous without HUMAN AI involvement.  Even this article is a premeditated assault upon your well being by human AI. Imaginary killer robots will never be able to mindfuck you as bad as the terrorist media model of human AI has for your entire lives.