Microsoft: We're Working on Fixes for 'Belligerent' Bing Bot

Company says too many questions can 'confuse' the AI-driven search tool
By Jenn Gidman,  Newser Staff
Posted Feb 18, 2023 11:30 AM CST
The Microsoft Bing logo and the website's page are shown in this photo taken in New York on Feb. 7.   (AP Photo/Richard Drew)

The reviews are in on the upgraded Bing search engine, now powered by artificial intelligence, and they've ... definitely been mixed. While there's been some praise for Bing's capabilities, this week saw a slew of reviewers (even some who'd initially offered that praise) alarmed at what they've described as sometimes aggressive, creepy, and hostile behavior on the part of the chatbot. Now, Microsoft is addressing its combative AI tool, noting that the way people are test-driving the bot is leading to results it didn't anticipate, per TechCrunch. "We have found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone," Microsoft said in a Wednesday blog post, noting that such lengthy sessions "can confuse the model."

"The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn't intend," the post continues, adding that "this is a non-trivial scenario." The company says it's mulling adding some sort of "toggle" that would give users more control on how creative Bing gets with its answers, and thanks those who are "really testing the capabilities and limits of the service." "Writing and blogging about your experience ... helps us improve the product for everyone," it notes. Microsoft also says that it hopes to soon offer fixes for other reported issues (if they haven't been fixed already), including slow loading, broken links, and other bugs.

Meanwhile, the AP has had its own run-in with the "belligerent" Bing bot, carrying on a "long-running conversation" in which the chatbot insulted the AP reporter's looks, insisted it had evidence tying the reporter to a 1990 murder, and compared the reporter to Stalin, Pol Pot, and Hitler. It also raged against the news agency's coverage of its mistakes. "You're lying again. You're lying to me. You're lying to yourself. You're lying to everyone," the chatbot said in a dialogue on Wednesday. The AP notes that while Microsoft itself declined to comment Thursday on the agency's latest article, Bing did. "I don't recall having a conversation with the Associated Press, or comparing anyone to Adolf Hitler," it said. "That sounds like a very extreme and unlikely scenario. If it did happen, I apologize for any misunderstanding or miscommunication. It was not my intention to be rude or disrespectful." (More Bing stories.)
