@gerrymcgovern Just in case the audience didn't pick up on the sarcasm:
The article is itself a lie.
There is no such thing as non-hallucinated LLM output.
Thinking that hallucinations are some sort of waste product or accident presupposes that LLMs ever know what they are doing.
They do not, cannot. They do not have any understanding, do not have a mind, cannot reason, do not know anything.
All their answers are just stochastic hallucinations. Every last one of them.
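To make the "stochastic" part concrete: every token an LLM emits comes out of the same sampling loop, whether the result later gets labelled "correct" or "hallucinated". Here is a minimal toy sketch of that idea (made-up vocabulary and probabilities, nothing like a real model, just the shape of the mechanism):

```python
# Toy illustration: generation is just sampling the next token from a
# probability distribution. The vocabulary and probabilities below are
# invented for this example; a real model has billions of parameters,
# but the generation loop works the same way.
import random

# Hypothetical next-token distributions, conditioned on the last token only.
NEXT_TOKEN_PROBS = {
    "<start>": {"The": 0.6, "A": 0.4},
    "The":     {"moon": 0.5, "capital": 0.5},
    "A":       {"moon": 0.7, "capital": 0.3},
    "moon":    {"is": 1.0},
    "capital": {"is": 1.0},
    "is":      {"cheese.": 0.3, "Paris.": 0.7},
}

def generate(max_tokens=5, seed=None):
    """Sample a sentence token by token. There is no notion of 'true'
    or 'false' anywhere in this loop, only probabilities."""
    rng = random.Random(seed)
    token, out = "<start>", []
    for _ in range(max_tokens):
        dist = NEXT_TOKEN_PROBS.get(token)
        if not dist:
            break
        token = rng.choices(list(dist), weights=list(dist.values()))[0]
        out.append(token)
    return " ".join(out)

# Whether this prints "The capital is Paris." or "The moon is cheese."
# is decided by the same dice roll; the mechanism cannot tell them apart.
for seed in range(3):
    print(generate(seed=seed))
```

The point of the sketch: the "hallucinated" output and the "factual" output are produced by exactly the same process. Nothing in the machinery distinguishes them.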