
Comment by no_wizard

5 days ago

>Asking the LLM is a vastly superior experience.

Not to be overly argumentative, but I disagree. If you're looking for a deep, ongoing learning process, LLMs fall down, because they can't remember anything and can't build on earlier sessions in that way. You end up having to repeat a lot of stuff. They also don't have good course correction (that is, if you're going down the wrong path, they don't alert you, as I've experienced).

They can also give you really bad content, depending on what you're trying to learn.

I think for things that present themselves as a form of highly structured data, like programming languages, there's good attunement there. But once you start trying to dig into advanced finance, political topics, economics, or complex medical conditions, the quality falls off fast, if it's there at all.

I used LLMs to teach me a programming language recently.

It was way nicer than a book.

That's the experience I'm speaking from. It wasn't perfect, and it was wrong sometimes, sure. A known limitation.

But it was flexible, and it was able to do things like relate ideas to programming languages I already knew, adapt to my level of understanding, and skip stuff I didn't need.

Incorrect moments or not, the result was that I learned something quickly and easily. That isn't what happened in the '90s.

  • > and it was wrong sometimes, sure. A known limitation.

    But that's the entire problem, and I don't understand why it's just put aside like that. LLMs are wrong sometimes, and they often just don't give you the details. In my opinion, knowing about certain details and traps of a language is very important if you plan on doing more with it than just having fun. Now someone will come around the corner and say "but but but it gives you the details if you explicitly ask for them." Yes, of course, but when you're still learning, you just don't know where the important details are hidden. Studying is hard and it takes perseverance. Most textbooks will tell you the same things, but they all still differ, and every author usually has a few distinct details they highlight; those are the important bits that you just won't get from an LLM.

    • It's not my experience that there are missing pieces as compared to anything else.

      Nobody can write an exhaustive tome and explore every feature, use, problem, and pitfall of Python, for example. Every text on the topic will omit something.

      It's hardly a criticism. I don't want exhaustive.

      The LLM taught me what I asked it to teach me. That's what I hope it will do, not try to caution me about everything I could do wrong with a language. That list might be infinite.


Most LLM user interfaces, such as ChatGPT, do have a memory. See Settings, Personalization, Manage Memories.

  • Sure, but there are limits here. That's what I'm talking about: limits. The memory isn't infinitely expansive. I've still found that it doesn't backtrack well enough to "remember" (for lack of a better term) that it told me something already if it's old enough, for example.

    It also doesn't seem to do a good job of building on "memory" over time. There appears to be some unspoken limit there, or something to that effect.