Elon Musk’s previous concerns about artificial intelligence have apparently solidified.

Elon Musk compares AI to Nukes

In tweets this weekend, Musk, the visionary technical leader of Tesla Motors Inc (NASDAQ:TSLA), SpaceX and SolarCity Corp (NASDAQ:SCTY), and the co-founder of PayPal, said artificial intelligence is “potentially more dangerous than nukes.”


This isn’t the first time Musk has expressed concerns.  As previously reported in ValueWalk, Musk raised issues about self-learning computers in June of this year.

It is interesting that someone deep inside the world of technology is sounding the alarm at a time when the culture greets NSA spying, and the fact that the CIA has abused this technology to spy on the US Senate, with a yawn. Tying the CIA spying issue to the Edward Snowden issue gets little mainstream media discussion, just as the dangers of controlling technology in a growing “internet of everything” world tend to get little, if any, mainstream media discussion.

But here is Musk, going against the mainstream and pausing to reflect on the situation, much as one should reflect before being swept head over heels into a romantic relationship and blinded by the toxin that is love.

That reflection should happen before society walks down the aisle of the “internet of everything” and becomes bound in a controlling relationship in which artificial intelligence can, without a conscience or any human morality whatsoever, coldly make decisions of critical importance to society.

Computers have already taken over trading, and the documented result has been more flash crashes than at any previous point in history (most of which are never identified or accounted for, some say by design).

Drone tech

Look at the latest drone technology plans, among the next steps in artificial intelligence, and it becomes clear that in the near future it could be the drone itself making the decision to fire upon and kill a human. Based on a pre-determined set of variables, a drone could murder a human in cold, machine-like fashion.

Would there be criminal justice for the killer drone? Not likely, if you consider how computers currently receive a more lenient legal standard than humans do (an argument that can be made by looking at high frequency trading and other cases pitting human conduct against computer conduct).

Also at issue are projections by academics that, in the not-too-distant future, computer applications could be implanted in the human brain to make it “more efficient.” Computers are already being used to control animal brains in drone experiments, so why not connect the dots and ponder the potential for a computer to control a human brain?

Perhaps this is why the headlong dive into a serious relationship with technology controlling every aspect of life might be like a bad marriage: only after the bond is difficult to break do you discover the controlling nature of your partner.

The key to any relationship is knowing what you’re getting into from the start.  If society is aware of the potential problems, then even if it moves in that direction there won’t be surprises, and issues can be addressed as they arise.

Musk didn’t detail his thoughts, unfortunately, but he is no doubt a serious thinker on the issue.  His thoughts should be documented, as his second tweet on the topic only begs for his full concerns to be outlined:

“Hope we’re not just the biological boot loader for digital superintelligence. Unfortunately, that is increasingly probable.”