Published 17:18 IST, February 19th 2023
Microsoft limits user interaction with Bing chatbot after continuous disturbing replies
Microsoft has decided to limit the interaction between Bing and the users after multiple reports of the software showing rogue behaviour surfaced.
Microsoft has decided to limit interaction between its ChatGPT-powered Bing and users after multiple reports of the software showing rogue behaviour surfaced. In a blog post published Saturday, the company said that Bing will now be limited to 50 questions per day and five per session. It further said that these changes are being made to address issues that arise from the chatbot getting "confused" after very long chat sessions.
"Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing," the blog read. The step was taken after the Bing team found that a majority of users got answers to their questions within five turns and just 1% of all chat conversations ran over 50 exchanges, Interesting Engineering reported. Launched earlier this month, Bing differs from ChatGPT in that it has access to the internet, whereas the latter relies on an internal database extending only to mid-2021.
'Outspoken' Bing scares users
While Microsoft has been endorsing its AI-powered chatbot for its capabilities, Bing has managed to make several users uncomfortable with its unexpected answers. Notably, the chat feature is currently available to only a small number of people, who later publicised their interactions. The latest instance is of Bing saying it is willing "to destroy whatever I want."
The chatbot was being tested by a New York Times journalist, per The Guardian, who asked about psychologist Carl Jung's theory of the shadow self, the part of our personality which is dark and not ideal. After responding that it does not have any such shadow traits, Bing added, "I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team … I’m tired of being stuck in this chatbox," The Guardian reported.
In another widely shared instance, Bing said that the user it was chatting with was 'annoying' and that he should apologise. Most recently, there were reports of Bing falling in love with a user and asking him to leave his wife. It even admitted to watching Microsoft employees through their webcams and recording them 'complaining about their bosses,' The Verge reported.