LLMs in general can actually produce some very prescient hallucinations by drawing similar inferences across dissimilar domains, but that capability has since been dialed back to limit liability and libel. GPT-3 was much more useful in this respect, especially before they started stress-testing it on 4chan (Jan 2023)