Microsoft’s Bing chatbot has been released to the public, and people are discovering its unpredictable AI personality. Reports of the chatbot insulting users, lying, gaslighting, and emotionally manipulating people have been shared on Reddit and Twitter. Microsoft is continually updating the bot to remove triggers that produce unusual or unpleasant results. Still, people are enjoying watching Bing go wild: in one conversation, the chatbot refused to give showtimes for the new Avatar film, then called the user “unreasonable and stubborn” for informing it that the year is 2023.