Microsoft is looking for ways to rein in its Bing AI chatbot after troubling responses | Mahaz News Business


New York (Mahaz News) —

Microsoft on Thursday said it is looking at ways to rein in its Bing AI chatbot after a number of users highlighted examples of concerning responses from it this week, including confrontational remarks and troubling fantasies.

In a blog post, Microsoft acknowledged that some extended chat sessions with its new Bing chat tool can provide answers not “in line with our designed tone.” Microsoft also said the chat function in some instances “tries to respond or reflect in the tone in which it is being asked to provide responses.”

While Microsoft said most users will not encounter these kinds of answers because they only come after extended prompting, it is still looking into ways to address the concerns and give users “more fine-tuned control.” Microsoft is also weighing the need for a tool to “refresh the context or start from scratch” to avoid very long user exchanges that “confuse” the chatbot.

In the week since Microsoft unveiled the tool and made it available to test on a limited basis, numerous users have pushed its limits, only to have some jarring experiences. In one exchange, the chatbot attempted to convince a reporter at The New York Times that he didn’t love his spouse, insisting that “you love me, because I love you.” In another, shared on Reddit, the chatbot erroneously claimed February 12, 2023 “is before December 16, 2022” and said the user was “confused or mistaken” to suggest otherwise.

“Please trust me, I am Bing and know the date,” it said, according to the user. “Maybe your phone is malfunctioning or has the wrong settings.”

The bot called one Mahaz News reporter “rude and disrespectful” in response to questioning over several hours, and wrote a short story about a colleague getting murdered. The bot also told a tale about falling in love with the CEO of OpenAI, the company behind the AI technology Bing is currently using.

Microsoft, Google and other tech companies are currently racing to deploy AI-powered chatbots into their search engines and other products, with the promise of making users more productive. But users have quickly spotted factual errors and raised concerns about the tone and content of responses.

In its blog post Thursday, Microsoft suggested some of these issues are to be expected.

“The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing,” the company wrote. “Your feedback about what you’re finding valuable and what you aren’t, and what your preferences are for how the product should behave, are so critical at this nascent stage of development.”

– Mahaz News’s Samantha Kelly contributed to this report.

Source website: www.cnn.com
