Microsoft limits Bing ChatGPT AI in attempt to stop it going awry

AI chatbot becomes ‘confused’ when chats are too long, creators say

Andrew Griffin
Monday 20 February 2023 17:26 GMT
Microsoft Artificial Intelligence (Copyright 2023 The Associated Press. All rights reserved.)

Microsoft has limited the number of interactions that people can have with its “new Bing” system, after it appeared to have a breakdown.

Earlier this month, Microsoft announced that it would be updating its Bing search engine with the same technology that underpins ChatGPT, allowing it to use artificial intelligence to discuss queries with users. The company said that it would allow for more precise and detailed answers.

But in recent days, users have found that the system has attacked and insulted them, lied to them and appeared to question its own purpose.

Some have suggested that the system may have become self-aware, and is expressing its own feelings. But it appears more likely that it has become confused and is attempting to match people’s conversations with messages of a similar tone and intensity.

Now Microsoft has said that the system also becomes “confused” when people talk to it for too long, and that it will limit how long those conversations can run.

The company had initially responded to reports that the chatbot was attacking users by saying that long conversations could make the system repetitive and that it may be “prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone”.

Conversations will now be capped at 50 chat turns per day and five chat turns per session, Microsoft said, with a chat turn being one question and one answer.

Most people already find the answer they are looking for within five turns, Microsoft said. Only around one percent of conversations have more than 50 messages.

Now, if someone tries to send more than five messages in a single session, they will be prompted to start again. At the end of each chat session, users will also be asked to clean it up, removing the old conversation so that the model does not get confused.
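For illustration only, the cap Microsoft describes, five chat turns per session and 50 per day, with a prompt to start afresh once the session limit is hit, could be sketched roughly as below. Microsoft has not published how the limit is enforced; the names and structure here are hypothetical.

    # Hypothetical sketch of a per-session / per-day chat-turn cap,
    # based only on the limits described in this article (5 turns per
    # session, 50 per day). Not Microsoft's actual implementation.

    SESSION_TURN_LIMIT = 5   # one turn = one question plus one answer
    DAILY_TURN_LIMIT = 50

    class ChatTurnLimiter:
        def __init__(self):
            self.session_turns = 0
            self.daily_turns = 0

        def record_turn(self):
            # Count a completed question/answer exchange.
            self.session_turns += 1
            self.daily_turns += 1

        def must_start_new_session(self):
            # After five turns the user is prompted to start again.
            return self.session_turns >= SESSION_TURN_LIMIT

        def daily_limit_reached(self):
            return self.daily_turns >= DAILY_TURN_LIMIT

        def reset_session(self):
            # Clear the per-session counter (and, in the real system,
            # the old conversation) so stale context does not carry over.
            self.session_turns = 0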

Over time, Microsoft “will explore expanding the caps on chat sessions to further enhance search and discovery experiences”.

The new limitations on the chatbot come after those initial reports led to a host of articles in which journalists conducted long conversations with Bing. In one, for the New York Times, a reporter published a two-hour conversation in which the system appeared to become increasingly critical and belligerent.
