Could part of the issue with hallucination in LLMs be that they are often trained on Internet conversations where no one is likely to chime in with "I don't know"?