Hallucination is the term used to describe the tendency of AI systems, such as OpenAI's ChatGPT, to generate content that sounds plausible but is not true. The phenomenon was first described in neural machine translation (NMT) systems, which are "susceptible to producing highly pathological translations that are completely untethered from the source material". Hallucinations are unexpected, incorrect responses from AI programs, and they can arise for reasons that are not yet fully understood. Examples include inventing scholarly citations, misrepresenting data, or asserting facts about events that do not appear in the model's training data.