Microsoft Says Talking To Bing For Too Long Can Cause It To Go Off the Rails
Thursday, February 16, 2023, 15:00, by Slashdot
Microsoft has responded to widespread reports of Bing's unhinged comments in a new blog post. From a report: After the search engine was seen insulting users, lying to them, and emotionally manipulating people, Microsoft says it's now acting on feedback to improve the tone and precision of responses, and warns that long chat sessions could cause issues. Reflecting on the first seven days of public testing, Microsoft's Bing team says it didn't 'fully envision' people using its chat interface for 'social entertainment' or as a tool for more 'general discovery of the world.' It found that long or extended chat sessions with 15 or more questions can confuse the Bing model. These longer chat sessions can also make Bing 'become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone.'
Microsoft hints that it may add 'a tool so you can more easily refresh the context' of a chat session, despite there already being a prominent 'new topic' button right next to the text entry box that wipes the chat history and starts fresh. The bigger problem is that Bing can often respond in the wrong tone during these longer chat sessions, or as Microsoft puts it, in 'a style we didn't intend.' Microsoft claims it will take a lot of prompting for most Bing users to run into these issues, but the company is looking at more 'fine-tuned control' to avoid cases where Bing starts telling people they're wrong, rude, or manipulative.
Read more of this story at Slashdot.