Microsoft gets exclusive access to AI deemed 'too dangerous to release'

Microsoft says the AI could be used to help writing and composition, translating languages, and more

Adam Smith
Wednesday 23 September 2020 15:30 BST

Microsoft has an exclusive license to use OpenAI’s GPT-3 artificial intelligence language generator, the company has announced.

GPT-2, the previous iteration of the model, made headlines for being “too dangerous to release”; the technology has numerous capabilities, including designing websites, prescribing medication, answering questions, and penning articles.

Microsoft says it will “leverage its technical innovations to develop and deliver advanced AI solutions for our customers”, although it was not specific about what those solutions would be.

“The scope of commercial and creative potential that can be unlocked through the GPT-3 model is profound, with genuinely novel capabilities – most of which we haven’t even imagined yet”, wrote Kevin Scott, Microsoft’s executive vice president and chief technology officer. 

“Directly aiding human creativity and ingenuity in areas like writing and composition, describing and summarizing large blocks of long-form data (including code), converting natural language to another language – the possibilities are limited only by the ideas and scenarios that we bring to the table,” he added.

OpenAI clarified on its own blog that the deal will not affect access to GPT-3 through OpenAI’s API, so existing and future users of the model will be able to continue to build applications.

It says its commercial model has received tens of thousands of applications. GPT-3 will also remain in a limited beta for academics to test the capabilities and limitations of the model.

Microsoft and OpenAI already have existing relationships; Microsoft Azure is the cloud computing service on which OpenAI trains its artificial-intelligence programs, and last year Microsoft became OpenAI’s exclusive cloud provider.

The reason OpenAI’s program was deemed so dangerous is that it can be fed a piece of text and predict the words that come next with such a high degree of accuracy that its output would be difficult to distinguish from a human’s.
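To illustrate the underlying idea of next-word prediction, here is a minimal sketch using a simple bigram model over a made-up corpus. This is only a toy illustration of the task; GPT-3 uses a vastly larger neural network and training set, not this method.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; GPT-3 was trained on a huge scrape of the web.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word in the corpus.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word`."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" twice, more than any other word
```

A language model like GPT-3 performs the same basic task, but it conditions on long stretches of preceding text rather than a single word, which is why its continuations can read as human-written.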

As such, it could be abused by extremist groups to create "synthetic propaganda" for white supremacists or jihadist Islamists, for example.

"Due to our concerns about malicious applications of the technology, we are not releasing the trained model," wrote OpenAI in February 2019.

Some believed that the impressive capabilities of the algorithm meant that it would threaten industries or even show self-awareness.

However, OpenAI’s CEO Sam Altman has said that such exaggerations are just “hype”.

“It’s impressive … but it still has serious weaknesses and sometimes makes very silly mistakes,” he added.
