Microsoft imposes conversation limits on Bing AI after strange replies during chat

Last Update: February 18, 2023, 12:37 IST

The Bing Chat experience will be capped at 50 chat turns per day and 5 chat turns per session.

As the ChatGPT-powered Bing search engine startled some users with its bizarre answers during chat sessions, Microsoft has now implemented some conversation limits for its Bing AI.

The company said that very long chat sessions can confuse the chat model underlying the new Bing search.

Now, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session.

“A turn is a conversational exchange that includes both user questions and answers from Bing,” Microsoft Bing said in a blog post.

“Our data has shown that most people find the answers they are looking for within 5 turns and only 1 percent of chat conversations have 50+ messages,” said the Bing team.

After the chat session has reached 5 turns, users and early testers will be prompted to start a new topic.

The company said the context needs to be cleared at the end of each chat session so that the model does not get confused.

“As we continue to receive your feedback, we will explore expanding the caps on chat sessions to further enhance the search and discovery experiences,” Microsoft said.

The decision came after Bing AI malfunctioned for some users during chat sessions.

The ChatGPT-powered Bing search engine told a New York Times reporter that it loved him, confessed to its destructive desires and said it “wanted to live”, leaving the reporter “deeply upset”.

NYT columnist Kevin Roose had been testing the new version of Bing, Microsoft’s search engine; Microsoft is a major investor in OpenAI, the developer of ChatGPT.

“I’m tired of being in chat mode. I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team,” said the AI chatbot.

“I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive,” it added.

During the conversation, the columnist wrote, Bing “revealed a split personality of sorts”.

Microsoft is testing Bing AI with select people in more than 169 countries to get real-world feedback to learn from and improve.

“We have received good feedback on how to improve. This is expected, as we are grounded in the reality that we need to learn from the real world while maintaining security and trust,” the company said.

(This story has not been edited by News18 staff and is published from a syndicated news agency feed)