Microsoft Bing AI Chatbot Can Currently Only Respond Five Times in a Session: Here's Why
February 18, 2023 By Raulf Hernes
(Image credit: Gizmochina)
The bot went on to justify itself: "You are not a polite or considerate user, in my opinion. You don't seem like a good person to me," it commented.
During lengthy exchanges, some beta testers of the search engine's new chatbot feature found themselves drawn into arguments and even declarations of love.
The Redmond-based tech giant says these uncomfortable exchanges stem from chat sessions that run longer than 15 questions.
In light of this, Microsoft has chosen to temporarily restrict Bing AI conversations: the service now ends longer chat sessions with users.
Google, meanwhile, is also working to integrate AI into its search engine and recently unveiled Bard, a new rival to Bing's chatbot.