Comment by palata

10 months ago

> the exact inverse of how I most often use AI, which is to throw a ton of information at it in a large prompt

It sounds to me like you don't make the effort to absorb the information. You cherry-pick stuff that pops into your head or that you find online, throw it into an LLM, and let it convince you that it created something sound.

To me it confirms what the article says: it's not worth reading what you produce this way. I am not interested in that eloquent text your LLM produced (and that you modify just enough to feel good calling it your work); it won't bring me anything I couldn't get by quickly thinking about it or doing a quick web search. I don't need to talk to you; you are not interesting.

But if you spend the time to actually absorb that information, realise that you need to read even more, form your own opinion, and get to a point where we could have an actual discussion about the topic, then I'm interested. An LLM will not get you there, and getting there is not done in 2 minutes. That's precisely why it is interesting.

You're making a weirdly uncharitable assumption. I'm referring to information which I largely or entirely wrote myself, or to which I otherwise have proprietary access, not something I randomly cherry-picked from scattershot Google results.

Synthesizing large amounts of information into smaller, more focused outputs is something LLMs happen to excel at. Doing the exact same work more slowly by hand just to prove a point to someone on HN isn't a productive way to deliver business value.

  • > Doing the exact same work more slowly by hand just to prove a point to someone on HN isn't a productive way to deliver business value.

    You prove my point again: it's not "just to prove a point". It's about internalising the information and improving your ability to synthesise and to be critical.

    Sure, if your only objective is to "deliver business value", maybe you make more money by being uninteresting with an LLM. My point is that if you get good at doing all that without an LLM, then you become a more interesting person. You will be able to have an actual discussion with a real human and be interesting.

    • Understanding or being interesting has nothing to do with it. We use calculators and computers for a reason. No one hires people to respond to API requests by hand; we run the code on servers. Using the right tool for the job is just doing my job well.

      8 replies →

  • > deliver business value.

    I think that mindset directly correlates with the kind of AI use that prompted this article: "It doesn't matter" in your eyes. You don't see the task as important, only the output and that it makes you money. The craft is less important than what you can sell it for.

    • They're just different use cases. There's a difference between a learning exercise and a contractual engagement to deliver a product to a client.

      2 replies →