Stephen Hawking Warns: Robots, Nuclear War & Aliens May Wipe Out Humanity In Less Than 100 Years

    By Vandita | 18 June 2016
    AnonHQ.com

    Astrophysicist Professor Stephen Hawking has warned that human civilization is entering the most dangerous 100 years in its history and faces extinction thanks to man-made threats such as artificial intelligence (AI) and human aggression, as well as hostile aliens [yes, you read it right].

    While Hawking believes technology can ensure mankind’s survival, he simultaneously warns that further developing AI could prove a fatal mistake. In a lengthy Q&A session on Reddit, Hawking explained why he considers AI humanity’s biggest existential threat:

    The real risk with AI isn’t malice but competence. A superintelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble.

    You’re probably not an evil ant-hater who steps on ants out of malice, but if you’re in charge of a hydroelectric green energy project and there’s an anthill in the region to be flooded, too bad for the ants. Let’s not place humanity in the position of those ants.
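
    To see what “competence, not malice” means in practice, here is a minimal Python sketch (an editor’s illustration, not anything Hawking wrote; the site names and numbers are invented): an optimizer maximizes the only objective it was given, so a side effect we care about never enters the calculation.

    ```python
    # Toy illustration of goal misalignment (hypothetical names and numbers).
    # The planner maximizes energy output; ant welfare is absent from its
    # objective, so the anthill site wins on the only metric that counts.

    sites = [
        {"name": "valley_with_anthill", "energy_gwh": 120, "ants_destroyed": 1_000_000},
        {"name": "barren_ridge",        "energy_gwh": 110, "ants_destroyed": 0},
    ]

    def objective(site):
        # Competence, not malice: the score only sees energy.
        return site["energy_gwh"]

    best = max(sites, key=objective)
    print(f"Chosen site: {best['name']} "
          f"(energy={best['energy_gwh']} GWh, ants destroyed={best['ants_destroyed']:,})")
    # -> Chosen site: valley_with_anthill (energy=120 GWh, ants destroyed=1,000,000)
    ```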

    If they [artificially intelligent machines] become clever, then we may face an intelligence explosion, as machines develop the ability to engineer themselves to be far more intelligent. That might eventually result in machines whose intelligence exceeds ours by more than ours exceeds that of snails.
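
    Hawking’s “intelligence explosion” is, at bottom, a claim about compounding: each generation of machines engineers a slightly more capable successor. A toy recurrence (again an editor’s sketch; the 10% per-generation gain is an assumption chosen purely for illustration) shows how quickly that feedback loop runs away.

    ```python
    # Toy model of recursive self-improvement (all parameters are assumptions).
    # If each generation improves itself in proportion to its own capability,
    # capability grows geometrically: c_{n+1} = c_n * (1 + r).

    capability = 1.0   # human-baseline capability (arbitrary units)
    r = 0.10           # assumed 10% self-improvement per generation

    for generation in range(1, 51):
        capability *= 1 + r
        if generation % 10 == 0:
            print(f"generation {generation:2d}: {capability:8.1f}x human baseline")
    # After 50 generations the toy exceeds 100x baseline -- the point of the
    # analogy, not a forecast.
    ```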

    If our machines don’t kill us, we might kill ourselves, predicts the cosmologist. During a tour of London’s Science Museum, Hawking warned that a major nuclear war would mean the end of human civilization, and urged people to be more empathetic, calling aggression the human race’s biggest failing and the one most likely to destroy us all.

    The Independent quoted the scientist as saying:

    The human failing I would most like to correct is aggression. It may have had survival advantage in caveman days, to get more food, territory or a partner with whom to reproduce, but now it threatens to destroy us all.

    Intelligence and aggression have the capacity to destroy us, but what about the threat from aliens? For the past few years, Hawking has warned that if an intelligent, more advanced alien civilization exists, it would not be friendly to less technologically advanced humans, and would have no problem conquering and colonizing the planet and eventually wiping out the human race. In April 2010, Hawking noted:

    If aliens ever visit us, I think the outcome would be much as when Christopher Columbus first landed in America, which didn’t turn out very well for the Native Americans. Such advanced aliens would perhaps become nomads, looking to conquer and colonize whatever planets they could reach. If so, it makes sense for them to exploit each new planet for material to build more spaceships so they could move on. Who knows what the limits would be?

    During a media event at the Royal Society in London in July 2015, Hawking voiced his fears again:

    We don’t know much about aliens, but we know about humans. If you look at history, contact between humans and less intelligent organisms has often been disastrous from their point of view, and encounters between civilizations with advanced versus primitive technologies have gone badly for the less advanced. A civilization reading one of our messages could be billions of years ahead of us. If so, they will be vastly more powerful, and may not see us as any more valuable than we see bacteria.

    4 COMMENTS

    1. Some very good points here, but even if the AI and alien threats never materialize, humanity and all other life on Earth are threatened by the combination of overpopulation and the reliance on the military to solve problems. As key resources become scarcer while populations escalate, the likelihood of nuclear war increases. The other major factor is that, at present, hardly any effort is being made to remove these combined threats by moving purposefully towards a non-economic-growth alternative, which makes action on this last problem all the more urgent.

    2. I think Stephen’s a bit out of touch. Aliens? I don’t think so. Aggression threatening to destroy us all? Overpopulation has aggression beat by a mile. A.I.? Yeah, OK, I’ll go along with that one.

    3. Nuclear war:
      Definitely afraid of it.

      Aliens:
      I feel like this article took a few random Hawking quotes to make a point. If Hawking had been asked directly, “Are you worried that aliens will kill us within 100 years?”, I think we’d get an “I don’t know, but it’s not impossible”. There’s no evidence that we’ll meet aliens, or that it will happen within 100 years, or that they’ll be more advanced, or that they’ll be hostile. I think it’s just as possible that we find microbes in some water on another planet as it is that we meet a hostile alien species like in the movies.

      Robots / AI
      This is why articles like this keep popping up: Hawking (and Elon Musk) have said we should be worried about AI. And yes, be worried about it the same way we’re worried about car safety. But Hawking’s statement is taken out of context a lot. He doesn’t mean that a robot AI will gain awareness and kill us all like in Terminator. He means that AI will get “too good” and make optimizations that turn out to hurt us. He answers the question in his AMA here: https://www.reddit.com/r/science/comments/3nyn5i/science_ama_series_stephen_hawking_ama_answers/cvsdlnr. My answer to it is: people are already thinking of that. Chill out, Hawking!

    4. Either the program to create artificial intelligence (which includes you, me, and everyone else) could not have been created any other way, or we are being simulated by aliens (or versions of our ancient ancestors, which would amount to as much). Either way, the reality you perceive is not as real as you would like to think it is. I wrote the black hole merger paradox, and mankind's nature will inevitably uncover the optimal solution to it…. Good luck.
