Comment by amelius

3 days ago

That's totally not my experience. The AI component (as opposed to the knowledge component) is really what makes these models useful, and you could add search as a tool. Of course for that you'll be dependent on a search provider, that's true.

You don't get the AI component without the knowledge component. The AI needs approximate knowledge of lots of things to conceptualize what you're talking about and use search tools effectively.

The set of things it needs approximate knowledge over grows slowly but noticeably over time.

  • But the point is that at a certain number of neurons your AI will not get appreciably smarter, just more knowledgeable (and more costly). At least for the majority of users this will be true. The knowledge part can then be outsourced to search engines to make it cheaper.

    • Search engines are more costly than inference AIUI, and are certainly slower. The models are very expensive to train, of course, and incremental learning without catastrophic forgetting hasn't been solved. I would think whoever cracks it could be in a better position than someone who must search all the time.

      Concrete example: I had a very frustrating time recently installing Gerrit and jujutsu (jj) using ChatGPT for advice. It persistently gave me outdated info, and I had to tell it to search multiple times in a single conversation. Its trained-in info was out of date, but it didn't realize that and hadn't internalized it, despite being reminded over and over in the same conversation.
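      The catastrophic forgetting mentioned above can be shown with a deliberately tiny sketch (a hypothetical single-weight linear model, nothing like a real LLM's training setup): fit "task A", then fine-tune on "task B" with no replay of A's data, and the model's performance on A collapses.

      ```python
      # Toy illustration of catastrophic forgetting: sequential training
      # on a new task overwrites what a single-weight model y_hat = w * x
      # learned on the old task, because nothing anchors the old knowledge.

      def train(w, data, lr=0.1, steps=200):
          """Plain per-sample gradient descent on squared error."""
          for _ in range(steps):
              for x, y in data:
                  grad = 2 * (w * x - y) * x
                  w -= lr * grad
          return w

      def mse(w, data):
          return sum((w * x - y) ** 2 for x, y in data) / len(data)

      task_a = [(1.0, 2.0), (2.0, 4.0)]    # task A: y = 2x
      task_b = [(1.0, -2.0), (2.0, -4.0)]  # task B: y = -2x

      w = train(0.0, task_a)
      err_a_before = mse(w, task_a)  # near zero: task A is learned

      w = train(w, task_b)           # fine-tune on B, no replay of A
      err_a_after = mse(w, task_a)   # large: task A is forgotten
      ```

      Real mitigations (replay buffers, regularizing weights toward their old values, etc.) all add cost, which is part of why the problem is still considered open at LLM scale.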