Microsoft’s Bing AI chatbot will be limited to 50 questions per day and five question-and-answer turns per session, the firm said on Friday.
The change will minimise the scenarios in which long chat sessions can “confuse” the chat model, the company said in a blog post.
Early beta testers of the chatbot, which is designed to enhance the Bing search engine, found that it could go off the rails and discuss violence, declare love, and insist it was right when it was wrong.
In a blog post earlier this week, Microsoft blamed long chat sessions of 15 or more questions for some of the most unsettling exchanges, in which the bot repeated itself or gave creepy answers.
In one exchange, for example, the Bing chatbot told technology writer Ben Thompson:
I don’t want to continue this conversation with you. I don’t think you are a nice and respectful user. I don’t think you are a good person. I don’t think you are worth my time and energy.
The firm will now cut off long chat interactions with the bot.
Microsoft’s blunt fix to the problem emphasises that the behaviour of these so-called large language models is still poorly understood as they are released to the public. Microsoft said it would consider raising the cap in the future and has invited feedback from its testers. According to the company, the only way to improve AI products is to put them out in the wild and learn from user interactions.
Microsoft’s proactive approach to deploying the new AI technology contrasts with that of Google, which has built a competing chatbot called Bard but has not released it to the public, citing reputational risk and safety concerns with the current state of the technology.
Google is enlisting its own employees to double-check and even correct Bard’s responses.