Stephen Hawking's birthday: The pioneering astrophysicist's most terrifying quotes
Hawking has warned humanity about our own barbarity, the growth of artificial intelligence and the likelihood that any aliens we meet will want to kill us
Astrophysicist Professor Stephen Hawking has turned 74, celebrating his birthday on 8 January. His has been a life of miracles and stunning discoveries, as well as terrifying warnings.
While he is famous for his pioneering work in astrophysics, he has also occasionally issued warnings to the rest of the human race about threats ranging from human aggression to artificial intelligence. Here are the most dire of those predictions, and the most dramatic of his advice.
“I don’t think we will survive another 1,000 years without escaping beyond our fragile planet.”
Professor Hawking has warned that space travel is necessary as a form of “life insurance” for humans. He has said that humanity is in grave danger for a number of reasons, from artificial intelligence to human barbarity, and as a result we should make sure we have somewhere else to go.
“If aliens visit us, the outcome could be much like when Columbus landed in America, which didn’t turn out well for the Native Americans.”
Professor Hawking has been one of the world’s leading advocates for searching for alien life. But he has also been clear about his worries over what would happen if aliens found us first.
If aliens were advanced enough ever to get here, they would probably be “nomads”, he said, “looking to conquer and colonise whatever planets they can reach”.
“Aggression […] threatens to destroy us all.”
Human aggression might once have been useful when it helped us “to get more food, territory or a partner with whom to reproduce”, Professor Hawking told The Independent. But now it could wipe us all out, he warned.
Professor Hawking named human aggression as the quality he would most like to wipe out. Instead we should magnify empathy, he said, because it “brings us together in a peaceful loving state”.
"I think computer viruses should count as life. I think it says something about human nature that the only form of life we have created so far is purely destructive. We've created life in our own image."
The same human barbarity that Professor Hawking has addressed elsewhere comes out in what we make, he warned in 1994. Computer viruses might not have a metabolism of their own, but they work like a parasite that can only replicate inside its host.
"The development of full artificial intelligence could spell the end of the human race."
Professor Hawking has consistently warned about the dangers of AI, saying that getting the technology wrong could lead to the demise of the human race. He has said that artificially intelligent systems would eventually learn to improve themselves and would advance at a rate far beyond that of humans; once they did, we wouldn’t be able to compete and would die out.
“The real risk with AI isn’t malice but competence […] You’re probably not an evil ant-hater who steps on ants out of malice, but if you’re in charge of a hydroelectric green energy project and there’s an anthill in the region to be flooded, too bad for the ants.”
Stephen Hawking made clear in a Reddit session earlier this year that the danger of artificial intelligence isn’t that it would want to kill us. Instead, it probably wouldn’t even think about us, crushing us simply because it is so powerful and its goals aren’t naturally aligned with ours.
“Everyone can enjoy a life of luxurious leisure if the machine-produced wealth is shared, or most people can end up miserably poor if the machine-owners successfully lobby against wealth redistribution. So far, the trend seems to be toward the second option, with technology driving ever-increasing inequality.”
Even if robots don’t wipe us out, they will probably continue the same pattern that has been playing out in the economy in recent years. While artificial intelligence could bring new “technology employment”, it could also simply leave people out of work with nothing to replace their jobs, and current trends suggest that is the more likely outcome.
“We urgently need to develop direct connections to the brain so that computers can add to human intelligence rather than be in opposition.”
One of the ways we could keep up with the massive speed at which AI develops is to use it ourselves. We could genetically engineer ourselves to keep pace, but a new human generation takes about 18 years to arrive, whereas computers roughly double in speed every 18 months. It would therefore be quicker to wire ourselves into computers and take advantage of their leaps forward.