Microsoft’s A.I.-powered chatbot, Bing, is just over a week old, and users think it’s getting a bit moody. Media outlets report that Bing’s A.I. is responding to prompts with human-like expressions of anger, fear, frustration, and confusion. In one exchange reported by The Washington Post, the bot said it felt “betrayed and angry” after the user, who had asked it several questions, identified themselves as a journalist.
It turns out that Bing’s A.I., if you talk to it long enough, can…