Comment by tough

6 months ago

We cannot blame the tools for how they are used by those wielding them.

I can use ChatGPT to teach me and help me understand a topic, or I can use it to give me an answer that I copy-paste without double-checking.

It just shows how much you care about the topic at hand, no?

If you used ChatGPT to teach you the topic, you'd write your own words.

Starting the answer with "I asked ChatGPT and it said..." almost 100% means the poster did not double-check.

(This is the same with other systems: If you say, "According to Google...", then you are admitting you don't know much about this topic. This can occasionally be useful, but most of the time it's just annoying...)

How do you know that ChatGPT is teaching you about the topic? It doesn't know what is right or what is wrong.

  • It can consult any source on any topic. ChatGPT is only as good a teacher as the pupil is at asking the right questions, if you ask me

    • I like to ask AI systems sports trivia. It's something low-stakes, easy-to-check, and for which there's a ton of good clean data out there.

      It sucks at sports trivia. It will confidently return information that is straight up wrong [1]. This should be a walk in the park for an LLM, but it fails spectacularly at it. How is this useful for learning at all?

      [1] https://news.ycombinator.com/item?id=43669364


    • It may well consult any source about the topic, or it may simply make something up.

      If you don't know anything about the subject area, how do you know if you are asking the right questions?


We can absolutely blame the people selling and marketing those tools.

  • Yeah, "marketing" always seemed to me like a misnomer, or doublespeak for legalized lying.

    All marketing departments are trying to manipulate you into buying their thing; it should be illegal.

    But just testing out this new stuff and seeing what's useful for you (or not) is usually the way