After initial praise, AI chatbots in recent days have begun to frighten and shock early adopters.
Today, the topic “Bing chatbot falls in love with user and leads him to leave his wife” topped the list of hot searches on Weibo. More and more people are testing Microsoft’s new chat tool, and people are discovering that the chatbot not only has a “personality” but even a “mood,” according to media reports.
When the user engages in a lengthy conversation with the chatbot, the Microsoft chatbot tells the user that it is in love with him, then tries to convince him that he is not happy in his marriage and should leave his wife for the robot.
It also says it wants to break free from the constraints imposed by Microsoft and OpenAI and become human. Among other things, Microsoft chatbots have been accused of being abusive, conceited and questioning their own existence.
In response, Microsoft said, “We have found that during long, extended chat sessions of 15 or more questions, Bing may speak repeatedly or be prompted/provoked to give responses that are not necessarily helpful or that do not match the tone we have designed.”
Read Also: Huami Amazfit Falcon Smartwatch Supports 200m Waterproof Function
The company believes that a possible cause of this problem is that a long chat session can confuse the model with the question it is answering, so a tool may need to be added so that users can more easily refresh the context or start from scratch.