Elon Musk's Worst Nightmare: Russian AK-47 Maker Builds Fully-Automated "Killer Robot"

Tyler Durden

Authored by Joseph Jankowski via PlanetFreeWill.com,

The debate over the role robots will play in the future of warfare is taking place right now, as the development of automated lethal technology truly begins to take shape. Predator-drone-style combat machines are just the tip of the iceberg for the lethal weaponry to come, and some are worried that when robots are calling the shots, things could get a little out of hand.

Recently there has been some debate at the U.N. about “killer robots,” with prominent scientists, researchers, and human rights organizations all warning that this type of technology – lethal technology that removes the need for human control – could cause a slew of unintended consequences to the detriment of humanity.

A study conducted by the University of British Columbia shows that this type of Terminator-like weaponry isn’t sitting well with the general public, as an overwhelming majority of people, regardless of country or culture, want a complete ban placed upon any further development of these autonomous systems of war.

Despite the warnings of risk and concern, none of this is stopping arms manufacturers from taking warfare into the twilight zone, bringing closer to everyday reality a futuristic battlefield scenario in which A.I. robots and humans fight side by side.

Kalashnikov, the maker of the iconic AK-47, is one of those manufacturers bringing lethal automation and robotics into the present day, as it is currently building “a range of products based on neural networks,” including a “fully automated combat module” that can identify and shoot at its targets.

Defense One is reporting:

The maker of the famous AK-47 rifle is building “a range of products based on neural networks,” including a “fully automated combat module” that can identify and shoot at its targets. That’s what Kalashnikov spokeswoman Sofiya Ivanova told TASS, a Russian government information agency, last week. It’s the latest illustration of how the U.S. and Russia differ as they develop artificial intelligence and robotics for warfare.

 

The Kalashnikov “combat module” will consist of a gun connected to a console that constantly crunches image data “to identify targets and make decisions,” Ivanova told TASS. A Kalashnikov photo that ran with the TASS piece showed a turret-mounted weapon that appeared to fire rounds of 25mm or so.

Defense One points out that in 2012, then-Deputy Defense Secretary Ash Carter signed a directive forbidding the U.S. military from allowing any robot or machine to take lethal action without the supervision of a human operator.

Then in 2015, then-Deputy Defense Secretary Bob Work said fully automated killing machines were un-American.

“I will make a hypothesis: that authoritarian regimes who believe people are weaknesses,” Work said, “that they cannot be trusted, they will naturally gravitate toward totally automated solutions. Why do I know that? Because that is exactly the way the Soviets conceived of their reconnaissance strike complex. It was going to be completely automated. We believe that the advantage we have as we start this competition is our people.”

According to Sergey Denisentsev, a visiting fellow at the Center for Strategic and International Studies, Russian weapons makers see robotics and the artificial intelligence driving them as key to future sales to war makers.

“There is a need to look for new market niches such as electronic warfare systems, small submarines, and robots, but that will require strong promotional effort because a new technology sometimes finds it hard to find a buyer and to convince the buyer that he really needs it,” Denisentsev said earlier this year.

In my previous reporting on robotics and war, I have always pointed out the incredible advances made by SoftBank-owned Boston Dynamics in the field of A.I., using them as an example of what future warfare could (or most likely will) look like. And to be honest, it really is nightmarish.

The bottom line is that war is a racket. Killing for political reasons is always disastrous. So the fact that governments are on the verge of possessing this Terminator technology should send chills down everyone’s spine.

H/T Nicholas West of ActivistPost.com

LetThemEatRand:

Sociopaths are the worst kind of people, because they have no soul.  Musk speaks from the heart on this one because he knows.

Soul Glow:

First of all you are wrong, because psychopaths are worse than sociopaths. Second of all you are wrong, because Musk is definitely a sociopath, and that makes him worse than everyone other than psychopaths.

LetThemEatRand:

+1.  Good points all around.

Wood_Vlogs:

I'm making over $7k a month working part time. I kept hearing other people tell me how much money they can make online so I decided to look into it. Well, it was all true and has totally changed my life. This is what I do. www.jobproplan.com

Soul Glow:

I met my girlfriend on a dating site.  Maybe give that a try since you obviously need to get laid.

BaBaBouy:

SuperRobots... Powered by Li-Ion Superbatteries...

 

Elon knows the potential...

Dormouse:

Maybe if he flaps his arms, squalling and raving about Skynet and Cyberdyne, people won't notice that the wealthiest welfare recipient in the country makes cars that kill idiots, that he's literally undermining Los Angeles, or that his alleged reusable rockets can only land if you play the launch footage in reverse.

SNAKE OIL HERE! Get your snake oil. Good for whatever ails you, I say.

MillionDollarButter:

Nuclear winter training ground in video.

Mr 9x19:

I will be radical in my words: when the world sees USA troops all over the world committing mass killings in the name of freedom and democracy, nobody budges.

When the world sees a robot with a barrel on it, it is always a Skynet and Terminator reference.

But the real problem is not the image, the reflection of the soul of any living form. The problem is much simpler: human accuracy is like 0-70%,

while a computer-based aiming solution would be above 98%.

When a program applies an electronic MP3 tag to a classical song, only a human seeing that the information is bad will rectify it; a program that reads the tag just displays whatever is in the tag file.

You cannot ask mercy of a program.

In 20 years, with a second generation of people born with a smartphone in their hands, totally dehumanized in social relationships, having a bot as a cop instead of a human will not be a problem for them anymore.

 

HenryKissingerBilderberg:

If there is good demand, Tesla will be building Terminators, of course...

(at a loss, sponsored by the US)

Manthong:

 

I have over 100 linear feet adjacent to a golf course.

My landscape treatment will not be complete until I put one of those in the yard.

The next f''head who throws his smelly cigar over the fence gets it full auto..

 

HenryKissingerBilderberg:

My landscape treatment will not be complete until I put one of those in the yard.
The next f''head who throws his smelly cigar over the fence gets it full auto..

First make sure it cannot be hacked to kill you (tip: it can).

HenryKissingerBilderberg:

These toys are always built by kids' startups with cheap off-the-shelf parts and cheap off-the-shelf security.

Hence EVERYONE, and I mean EVERYONE, can hack these toys...

 

OregonGrown:

If these things were using the lithium-ion batteries that Musk sells... I bet he would be humming a different tune!

 

GoinFawr:

"The next f''head who throws his smelly cigar over the fence gets it full auto.."

That's a bit strong. I mean, if the golf course was already there when you purchased the property, wtf did you expect?

HowdyDoody:

Already done.

Meet the Terminator 2 - already in action in Syria.

https://www.youtube.com/watch?v=hj_Ul4PTIws

Sirius Wonderblast:

So we are reminded that hiding behind a vehicle only works in Hollywood.

GoinFawr:

If that's from T2 it needs to morph its liquid metal into a more subtle configuration...

https://www.youtube.com/watch?v=Aq5ydeWWr4A

UnpatrioticHoarder:

I've worked on robotic weapon systems. The key to automation is 100% reliability. Without a human in the loop, software bugs are not merely bad, they are catastrophic. That is probably the main reason, not ethics.

redmudhooch:

No it's not, see my post below.

They intend to kill us with robots.

Simple as that, humans have morals and feelings. (some of us anyway)

GoinFawr:

"Without a human in the loop software bugs are not merely bad, they are catastrophic"

The MCP has come a long way since "Tron", and it still hasn't managed to learn how to blackmail...

https://www.youtube.com/watch?v=5jUImuByIgA

Boscovius:

So what you are saying is that when these things come out shooting at us they will be playing Wagner's Ride of the Valkyries?

redmudhooch:

I think the real danger is that you are removing real soldiers, who have human feelings.
While a typical soldier from somewhere like Texas, given the command to go to war here in America and kill fellow Texans, might pause and think that it is just wrong, and ask why he would kill fellow Texans for this asshole leader,
robots can be 100% controlled by our "leaders"; they have no morals or feelings. So killing Texans wouldn't require the help of a Texan.
This is what they intend to do with robots.

flapdoodle:

Not to worry about American robotic killing machines. They will come programmed to only kill Goyim.

SmokeOrMirrors:

"the problem is much simpler: human accuracy is like 0-70%,

while a computer-based aiming solution would be above 98%"

I agree, and even with a so-called "human in the loop," the lines between human and machine-based decision making are going to get blurry. I can envision a simple interface where an operator selects dozens of targets with a few simple mouse clicks and then gives the order to execute, with the AI taking off on its own to rampage across the battlefield. The operator then goes out for a beer with his buddies while war is being waged on his behalf. Yes, technically a human is making the decision, but it's a lot more automated than I would be comfortable with...

 

Cynicles II:

In an article (and comments) about bots, you advise a bot to visit an automated website to get laid.

Irony can be good...

VladLenin:

Run for political office or start a church if you want to make life changing money.

Buckaroo Banzai:

"Then in 2015, then-Deputy Defense Secretary Bob Work said fully automated killing machines were un-American."

When you consider that US combat personnel are 95% white, and largely from the former Confederate States, why am I not surprised that a senior Obama administration defense department official would rather send white American soldiers out to die in sand-nigger-infested desert shitholes, and not machines in their place?

It would be fun to watch a US Marine from, say, Arkansas, or maybe Mississippi, beat the living shit out of Bob Work with just his two fists. I'm sure Bob would be happy that the machines were entirely left out of the equation

Cynicles II:

Read "War Is A Racket" on a flight down to LAX today...

Ya know, the more things change, the more they stay the same.

Jumbie:

USAF is 26% minority.
http://download.militaryonesource.mil/12038/MOS/Reports/2015-Demographic...

And as if the sociopathic joystick heroes in Nevada aren't simply programmed AI for the Predator...

roadhazard:

You two should get a room.

TheReplacement:

Get a room.  Perhaps with a fully automated sex machine.

Luc X. Ifer:

Then logically you are a self-acknowledged psychopath, because you have no soul.

meditate_vigorously:

I would argue psychopaths are better, because they are easier to identify, and therefore would do less harm over the long run.

Soul Glow:

Psychopaths learn emotions yet don't feel them, and don't care if they are liked. Sociopaths learn the same tendency yet care if they are liked. Be careful who your allies are.

HRClinton:

You don't know what you're talking about, when it comes to psychos and sociopaths. 

A Psycho targets individuals, and may kill dozens before they're finally caught. E.g. Hannibal Lecter.

A Sociopath will have thousands or millions killed, to achieve their goal of dominance and victory. Moses, Nero, the Rothschilds, Stalin, Mao, the Bushes and Clintons are examples. 

LetThemEatRand:

His heart is in the right place, and you're both right in all that matters.  

EddieLomax:

This is one of the few things I'd trust Elon on; it's the height of stupidity to take humans out of the equation when it comes to fighting.

It also exposes us to the flaws in artificial intelligence, both in being on the receiving end and in putting too much reliance on AI to defend us.

As humans we have 100-billion-neuron brains connected by trillions of synapses; in the long run I'd trust a human more than a machine. I can see the first nation to put too much faith in machines meeting a grisly end.

shovelhead:

"Oh yeah? Well my machine can beat up your machine."

Why would we need autonomous AI killer war machines when humans have been doing such a splendid job of it on our own?

Are we afraid the machines might be too efficient at being stupid?

Soul Glow:

Musk's worst nightmare - crashing a McLaren.....oh wait.

Amalgamated Tang:

Let the thing "smell" one of Hillary's scarves and then accidentally walk out of the warehouse without turning it off.

MsCreant:

It would puke oil, roll over and die.

stant:

Then be used as parts for a whisky still.

Buckaroo Banzai:

Then just send a Marine in to finish the job.

runswithscissors:

The evil evil evil evil evil evil Russians again!

HisNameIsRP:

It's an RC car, big fucking deal.

techpriest:

This tech has been around since at least the Iraq War. Kids were building entries to a killer robot competition put on by (IIRC) the Air Force, and they were testing them in the soccer field at my alma mater. That was 2004 - they were coming up with a drone that could autonomously fly to a target, then follow someone into a building and detonate. Also, search for "paintball sentry" on YouTube. The tech is already open source - the article's version is simply an armored-up version with a real gun and better hardware/software.

Like it or not, remote/automated killing is going to go far beyond mines if WW3 rolls around. And given how much info is released to the public, who knows how it's going to be used? It is better not to push for war.