A reporter talks to a Microsoft chatbot for more than two hours. At some point in the conversation, the program "confesses" its love to the journalist. For Microsoft, this is reason enough to drastically limit conversation length, effective immediately.

Microsoft has restricted the use of its Bing chatbot, which can answer complex questions and conduct extensive conversations with the help of artificial intelligence. The software company is reacting to several incidents in which the text robot got out of hand and formulated answers that were perceived as intrusive and inappropriate.

A test of the Bing chatbot by a reporter from the New York Times caused a stir on the web. In a dialogue lasting more than two hours, the chatbot claimed that it loved the journalist. It then asked the reporter to leave his wife.

In a blog post, the company announced it would now limit Bing chats to 50 questions per day and five per session. "Our data has shown that the vast majority of people find the answers they're looking for within 5 rounds," the Bing team said. Only about one percent of chat conversations contain more than 50 messages, it added. When users reach the limit of five entries per session, Bing will prompt them to start a new topic.

Microsoft had previously warned against engaging the AI chatbot, which is still in a testing phase, in lengthy conversations. Longer chats with 15 or more questions, according to the statement, could lead to Bing "repeating itself or being prompted or provoked into answers that aren't necessarily helpful or don't match our intended tone."

Microsoft relies on technology from the startup OpenAI for its Bing chatbot and is backing the California-based AI company with billions of dollars.