Comment by Yokohiii
17 hours ago
An LLM's "wrong" decision is either systemic or biased. LLMs learn "common sense" from human input (e.g., shared datasets, reinforcement learning). If a decision is flat-out wrong for you, asking 10 LLMs is unlikely to help.