Comment by blibble
1 day ago
how do you gain anything useful from a sycophantic tutor that agrees with everything you say, having been trained to behave as if the sun shines out of your rear end?
making mistakes is how we learn, and if they are never pointed out...
It's a bit of a skill. Gaining an incorrect understanding of a topic is a risk however you learn, and I don't feel it's greater with LLMs than with many of the alternatives.
Sure, having access to legit experts who can tutor you privately on a range of topics would be better, but that's not realistic.
What I find is that if I need to explore some new domain within a field I'm broadly familiar with, just thinking through what the LLM is saying is sufficient for verification, since I can look for internal consistency and check against things I know already.
When exploring a new topic, often my questions are superficial enough for me to be confident that the answers are very common in the training data.
When exploring a new topic that's also somewhat niche or goes into a lot of detail, I use the LLM first to get a broad overview and then drill down by asking for specific sources and using the LLM as an assistant to consume authoritative material.
this "logic" applied across society will lead to our ruin
Say more?
> from a sycophantic tutor that agrees with everything you say
You know that it's possible to ask models for dissenting opinions, right? Nothing's stopping you.
> and if they are never pointed out...
They do point out mistakes though?