Microsoft is introducing conversation restrictions for Bing AI. Bing chats will now be limited to 50 questions per day and five per session, after the search engine's chatbot was found to have misled and emotionally manipulated users. Before that, the search engine's AI was unavailable several times. If a user exceeds the five-message limit, Bing will prompt them to start a new topic to avoid long sessions.
“Our data showed that the vast majority of people find the answers they’re looking for within 5 turns, and that only about 1% of chat conversations contain more than 50 messages,” Bing said in a blog post.
Microsoft warned that longer chat sessions with 15 or more questions could cause Bing to repeat itself or generate answers that aren’t necessarily helpful or in the right tone.
Previously there were reports of "unhinged" Bing conversations, and The New York Times published a two-plus-hour exchange with Bing in which the chatbot talked about how it loved the author, claimed it could not sleep at night, and made other similar "revelations".
Microsoft is still working on improving Bing's tone. The exact nature of the model's limitations is not known and not final, so new restrictions or changes to user sessions are possible in the near future.
The company continues to work on improving Bing AI Chat, making small changes almost daily and releasing larger patches once a week. Microsoft's admission that it did not fully anticipate people using the chat interface for "social entertainment" or as a tool for more "general discovery of the world" is telling – it is rather naive to think that people would limit their communication with chatbots to a few simple templates.
Source: The Verge