Just over a week after launching its AI-powered Bing search engine, Microsoft is working to fix problems with factual errors and weird conversations stemming from the ChatGPT-related technology under the hood.

In a blog post Wednesday, the software giant said longer chats can send Bing around the bend when it tries to reflect the tone of the human side of the conversation. It also acknowledged that Bing struggles to provide “answers that need very timely data,” such as live sports scores. And it said it’ll quadruple the amount of data the model uses to answer queries requiring hard facts, like numbers from financial reports.

“Lastly, we’re considering adding a toggle that gives you more control on the precision vs creativity of the answer to tailor to your query,” the company said in its post. Microsoft uses its in-house Bing technology to “ground” AI-boosted answers when accuracy is needed, but it relies on the language technology from partner and ChatGPT creator OpenAI when more “creative” responses are called for.

Bing’s new chat function, which puts the OpenAI technology front and center, can get weird in extended sessions of 15 or more questions, Microsoft said. The AI model gets confused or shifts its tone oddly. This is evident in a transcript of New York Times tech columnist Kevin Roose’s bizarre conversation with the AI, in which it confesses a desire to spread misinformation and become human, then tries to convince him to leave his marriage and be with it. Users of the Bing subreddit have also spotlighted examples of the chatbot running amok.

There have also been instances in which Bing became defensive or refused to admit an error, according to Fast Company.

“This is a nontrivial scenario that requires a lot of prompting, so most of you won’t run into it, but we are looking at how to give you more fine-tuned control,” Microsoft said.

The company also promised to address technical issues like slow loading, broken links and incorrect formatting, and to look at adding new features like booking flights, sending email and sharing answers.

Editors’ note: CNET is using an AI engine to create some personal finance explainers that are edited and fact-checked by our editors.

