Without a 'world government' technology will destroy us, says Stephen Hawking

'This aggression may destroy us all by nuclear or biological war. We need to control this inherited instinct by our logic and reason'

Aatif Sulleyman
Wednesday 08 March 2017 13:11 GMT
Mr Hawking has previously said that AI could grow so powerful it would be capable of killing us entirely unintentionally (Getty)

Stephen Hawking has warned that technology needs to be controlled in order to prevent it from destroying the human race.

The world-renowned physicist, who has spoken out about the dangers of artificial intelligence in the past, believes we need to establish a way of identifying threats quickly, before they have a chance to escalate.

“Since civilisation began, aggression has been useful inasmuch as it has definite survival advantages,” he told The Times.

“It is hard-wired into our genes by Darwinian evolution. Now, however, technology has advanced at such a pace that this aggression may destroy us all by nuclear or biological war. We need to control this inherited instinct by our logic and reason.”

He suggests that “some form of world government” could be ideal for the job, though it might create problems of its own.

“But that might become a tyranny,” he added. “All this may sound a bit doom-laden but I am an optimist. I think the human race will rise to meet these challenges.”

In a Reddit AMA back in 2015, Mr Hawking said that AI could grow so powerful it would be capable of killing us entirely unintentionally.

“The real risk with AI isn't malice but competence,” Professor Hawking said. “A super intelligent AI will be extremely good at accomplishing its goals, and if those goals aren't aligned with ours, we're in trouble.

“You're probably not an evil ant-hater who steps on ants out of malice, but if you're in charge of a hydroelectric green energy project and there's an anthill in the region to be flooded, too bad for the ants. Let's not place humanity in the position of those ants.”

Tesla CEO Elon Musk shares a similar viewpoint, having recently warned that humans are in danger of becoming irrelevant.

“Over time I think we will probably see a closer merger of biological intelligence and digital intelligence,” he said, suggesting that people could merge with machines in the future, in order to keep up.
