Microsoft’s AI chatbot, Copilot, has been known to produce strange and inaccurate responses, commonly called “hallucinations.” To combat the issue, Microsoft has capped the number of turns per chat session and is working to ground the model’s answers in data retrieved from Bing search.