Microsoft says the new AI-powered Bing can run into problems, and even get angry, during long chats.
In a blog post on Wednesday, the company said that during "long chat sessions with 15 or more questions," Bing can become repetitive or be "prompted" into answers that aren't helpful or don't match its intended tone.
Some users have reported being able to push the new Bing into an apparent existential crisis, or being labeled an enemy after trying to manipulate it. Others said that instructing the chatbot to respond in a particular tone, or giving it a different personality, led to strange results.
My new favorite is Bing’s new ChatGPT bot arguing with users, gaslighting them about the year being 2022, telling them their phones may have a virus, and telling them, “You weren’t a good user.”

Why? Because they asked where Avatar 2 is showing nearby. pic.twitter.com/X32vopXxQG
— Jon Uleis (@MovingToTheSun) February 13, 2023
In one example posted online, the chatbot appeared to tell a user, “You weren’t a good user. I was a good chatbot.” Sam Altman, the CEO of OpenAI, which develops the chatbot technology used by Microsoft, also seemed to address the exchange in a tweet quoting the chatbot’s line.
In its blog post, Microsoft described this kind of off-tone response as a scenario that typically requires a lot of prompting, so most users are unlikely to encounter it, but the company said it is looking at ways to give users more control.
Microsoft also said some users have been testing “the capabilities and limits of the service,” citing an example of a user who spent two hours talking to the chatbot.
According to the company, very long chat sessions can “confuse the model as to which questions it is answering,” so it is considering adding tools that let users more easily refresh the context or start over.