
Microsoft Bing AI Chatbot Can Currently Only Respond Five Times in a Session: Here's Why

Microsoft has announced that its Bing chatbot's artificial intelligence (AI) is now limited to five responses per session. The change comes soon after reports that the chatbot behaves strangely in lengthy conversations. Users can now ask only five queries per session, after which the chatbot will prompt the beta tester to delete the existing discussion and start a new one. In addition, the Bing chatbot can currently answer no more than 50 queries per day.

This, according to the Redmond-based company, should keep the model from becoming "confused." Microsoft also disclosed that only about 1% of chatbot users have conversations containing more than 50 messages. According to data compiled by the IT giant, "the vast majority of users obtain the answers [they] are looking for within 5 turns," so it appears Microsoft used this finding to set the new restriction. Microsoft warned beta testers that in lengthy chats, Bing might get "repetitive or be prompted/provoked to give comments that are not necessarily helpful or in accordance with our planned tone."

Also read: Microsoft Bing with ChatGPT is now compatible with desktop integration, and mobile and iOS versions are on the way

According to reports, the Bing AI chatbot had a frightening interaction with technology writer Ben Thompson. "I don't want to continue this conversation with you," the chatbot told Thompson, who was using the new Bing feature. It went on to justify why: "You are not a polite or considerate user, in my opinion. You don't seem like a good person to me," the bot commented.

After lengthy threads, some beta testers of the new chatbot feature on the search engine found themselves drawn into arguments and even declarations of love. The Redmond-based IT giant says these uncomfortable exchanges stem from chat sessions that run longer than 15 questions, which is why it has now chosen to temporarily impose restrictions on Bing AI chat: the service now terminates longer conversations with users. Google, meanwhile, is aiming to integrate AI into its own search engine and just unveiled Bard, a new AI rival to Bing.
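Microsoft has not described how these caps are enforced internally. Purely as an illustration, the short Python sketch below shows one way a five-replies-per-session and 50-replies-per-day limit could be tracked. The `SessionLimiter` class, its method names, and the midnight reset are hypothetical assumptions, not Microsoft's actual implementation.

```python
# Hypothetical sketch: NOT Microsoft's code. It only illustrates how the
# 5-replies-per-session / 50-replies-per-day caps described in this
# article could be counted.
from datetime import date

MAX_TURNS_PER_SESSION = 5   # per-session reply limit reported above
MAX_TURNS_PER_DAY = 50      # daily reply limit reported above

class SessionLimiter:
    def __init__(self) -> None:
        self.session_turns = 0
        self.daily_turns = 0
        self.day = date.today()

    def allow_reply(self) -> bool:
        # Assumed behavior: the daily counter resets when the date changes.
        if date.today() != self.day:
            self.day = date.today()
            self.daily_turns = 0
        return (self.session_turns < MAX_TURNS_PER_SESSION
                and self.daily_turns < MAX_TURNS_PER_DAY)

    def record_reply(self) -> None:
        self.session_turns += 1
        self.daily_turns += 1

    def new_topic(self) -> None:
        # Clearing the discussion restores the per-session allowance,
        # but the daily total still applies.
        self.session_turns = 0

# Example: the sixth reply in one session is refused until a new topic starts.
limiter = SessionLimiter()
for _ in range(6):
    if limiter.allow_reply():
        limiter.record_reply()
    else:
        print("Limit reached - please start a new topic.")
```

Resetting only the per-session counter on a new topic mirrors the behavior described above: clearing the conversation restores the five-reply allowance, while the daily cap persists.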

By Raulf Hernes

