Microsoft responds after users of new Bing chatbot complain about its latest behaviour
Microsoft has responded to users of its new Bing chatbot, who have complained that limitations intended to stop the system going awry have also made it boring.
In the days since Microsoft announced that it was integrating ChatGPT technology into its search engine, many users have conducted long conversations with the system – and been left disturbed by what it has said. Users have described its chats as “unhinged”, and reported that it had attacked and lied to them.
In response, Microsoft said that many of those unusual remarks by its chatbot came in long conversations for which it had not been built. It said that such long chats “confused the underlying model”, and that they made up only a very small fraction of all conversations.
In an attempt to reduce the number of those interactions, Microsoft said that it would be limiting chats to five “turns” per session and a total of 50 each day. It encouraged users to clear out their history regularly so that the chatbot would forget old conversations and not become confused.
“This was in response to a handful of cases in which long chat sessions confused the underlying model,” Microsoft said.
“These long and intricate chat sessions are not something we would typically find with internal testing. In fact, the very reason we are testing the new Bing in the open with a limited set of preview testers is precisely to find these atypical use cases from which we can learn and improve the product.”
After it announced those changes, however, many users complained that the shorter chats had ruined the fun of the system, and that Microsoft had reduced the number of ways that it could be used. On Reddit, for instance, users complained that the new Bing had become little more than an enhanced search engine.
The company said that it had received feedback directly that people wanted longer chats so that they could “both search more effectively and interact with the chat feature better”.
Over time, Microsoft said it hopes to bring back longer chats and is “working hard as we speak on the best way to do this responsibly”. Initially, it will increase the limits to six turns per session and 60 per day, with a view to increasing the daily cap to 100 chats “soon”.
But Microsoft also intends to add an option that will let users control how the system deals with their questions. It suggested that it will add something like a slider, so that users can choose a more “creative” option if they want the kinds of in-depth discussions that have gone viral in recent days.
“We are also going to begin testing an additional option that lets you choose the tone of the Chat from more Precise – which will focus on shorter, more search focused answers – to Balanced, to more Creative – which gives you longer and more chatty answers. The goal is to give you more control on the type of chat behavior to best meet your needs,” Microsoft said.