Comment by HPsquared
3 months ago
It's a plausible-sounding list, but that's exactly the kind of thing a hallucinating LLM would produce when asked the question. It's hard to know how real these "introspection" prompts are - not just for this LLM but in general.