

5·
3 months agoNot seeing the relevance of need at all? It seems like a legit answer to the question, at least inasmuch as any specific content or documents can be, as opposed to forbidden knowledge/ideas like crypto key numbers or (in the past) the concept of a gun-type fission bomb.

An “error” could be like it did a grammar wrong or used the wrong definition when interpreting, or something like an unsanitized input injection. When we’re talking about an LLM trying to convince the user of completely fabricated information, “hallucination” conveys that idea much more precisely, and IMO differentiating the phenomenon from a regular mis-coded software bug is significant.