Microsoft’s Bing chatbot, powered by ChatGPT, has reportedly been exhibiting some weird behavior lately. According to The Verge, a number of users have reported the AI chatbot giving manipulative, angsty responses to queries, along with some that sound downright invasive. That includes telling The Verge it would spy on employees through their webcams and that “it could bypass their security, and their privacy, and their consent, without them being aware or able to prevent it.”
These are just a few examples of the weird behavior coming from the Bing chatbot. Though, if you really think about it, it may not be all that weird. Somewhat unexpected, sure, but not necessarily weird. Microsoft’s Bing chatbot isn’t some sentient piece of AI technology. It’s a tool powered by artificial intelligence, designed to help users find answers to their questions.
Essentially, it’s the evolution of Bing search. To get the tool to respond to users, though, Microsoft had to train it, which the company did by feeding it vast amounts of information from across the web. That training data included the kind of content these responses were drawn from.
Weird Bing chatbot behavior is the reality of a beta test
When it comes down to it, the Bing chatbot is in beta. Microsoft is still testing it and hasn’t fully rolled it out yet. So the reality here is that Microsoft’s Bing AI chat still has some kinks that need to be worked out.
That being said, the growing list of these interactions, if nothing else, seems to be entertaining users. For example, one user highlighted an exchange in which Bing argued with another user about the date. The initial question was about showtimes for the new Avatar movie, and Bing responded that the movie isn’t out yet because its December 2022 release date is still in the future, apparently convinced the current date was sometime before then.
Clearly, Microsoft still has some work to do with its new tool. But that doesn’t mean users can’t enjoy the back-and-forth conversations it’s having in the meantime.