Elon Musk issued a dire warning about artificial intelligence this week, calling it a threat to the public and saying it is quickly leading to the singularity, a point at which technology, not humans, is in control.
Musk says this future would resemble the movie Terminator, though, he adds, it won’t play out exactly like that.
Musk said that Google co-founder Larry Page wants technology to act as a god, and that when Musk questioned this, Page called him a “speciesist,” meaning someone who favors human survival over that of other species. Isn’t that human nature?
Compare this to the letter Bill Gates issued last month and you get two polar perspectives. One man (Gates) thinks that AI could eventually decide humans are a threat and work against us, but that the benefits outweigh the risks, so, onward! The other (Musk) thinks the government must regulate AI before that happens.
What do you think? Should we think this through before we no longer have the option?