Microsoft CEO warns tech industry against creating the dystopian future of George Orwell's '1984'

'There are unintended consequences of technology'

Aatif Sulleyman
Wednesday 10 May 2017 16:53 BST
Facebook has come under heavy fire recently for failing to adequately combat the spread of fake news (istock / ilbusca)

Microsoft’s CEO has warned the technology industry against creating a dystopian future, the likes of which have been predicted by authors including George Orwell and Aldous Huxley.

Satya Nadella kicked off the company’s 2017 Build conference with a keynote that was as unexpected as it was powerful.

He told the developers in attendance that they have a huge responsibility, and that the choices they make could have enormous implications.

“I’m an unrepentant tech optimist, there’s no question of that,” said Mr Nadella. “But I’m also grounded. There are unintended consequences of technology.

“And it’s not that we can just use more technology to solve those problems, and technologies by themselves cannot solve these. But I do believe that it’s up to us to ensure that some of the more dystopian scenarios don’t come true.”

At this point in the speech, the main screen in the conference hall displayed Orwell’s 1984 and Huxley’s Brave New World.

“[Think about] what Orwell prophesied in 1984, where technology was being used to monitor, control, dictate,” continued Mr Nadella. “Or what Huxley imagined we may do by just distracting ourselves without any meaning or purpose.

“Neither of these futures is something that we want. So the question is: what are we going to do? Are there practical ways we can make progress?”

According to Mr Nadella, technology should be inclusive, empower people and build trust.

“I think it starts with us taking accountability. Taking accountability for the algorithms we create, the experiences that we create, and ensuring that there is more trust in technology with each day.”

Facebook has come under heavy fire recently, amid widespread belief that fake news stories spread on its platform manipulated public opinion and influenced the 2016 US presidential election.

The company’s algorithms also failed to spot graphic footage of murders that had been posted on the site. The videos were eventually pulled, but only after hundreds of thousands of people had watched them.

“We want to think about people,” concluded Mr Nadella. “But we also want to think about the institutions people build.”
