Comment by namaria

4 days ago

No, mental models matter. This has nothing to do with AGI doomerism.

Knowing implies reasoning. LLMs don't "know" things. These statistical models continuate text. Having a mental model that they "know" things, or that they can "reason" or "follow instructions", is driving all sorts of poor decisions.
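The "continuate text" claim can be made concrete with a toy sketch. This is a hypothetical bigram lookup table, nothing like a real transformer, but the outer loop is the same shape: map the context to a distribution over next tokens, sample one, append, repeat.

```python
import random

# Hypothetical next-token probabilities (a toy bigram table,
# standing in for a trained model's output distribution).
NEXT_TOKEN_PROBS = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"sat": 0.5, "ran": 0.5},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
    "down": {"<eos>": 1.0},
    "away": {"<eos>": 1.0},
}

def continuate(prompt, max_tokens=10, seed=0):
    """Append sampled tokens until <eos> or a length cap is reached."""
    rng = random.Random(seed)
    tokens = prompt.split()
    for _ in range(max_tokens):
        dist = NEXT_TOKEN_PROBS.get(tokens[-1])
        if dist is None:  # unknown context: stop continuating
            break
        choices, weights = zip(*dist.items())
        nxt = rng.choices(choices, weights=weights)[0]
        if nxt == "<eos>":
            break
        tokens.append(nxt)
    return " ".join(tokens)
```

Calling `continuate("the")` always yields a sentence like "the cat sat down": the loop never consults anything you could call a belief, it just walks a probability table. Whether that description also exhausts what a large transformer does is exactly the point under dispute in this thread.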

Software has an abstraction fetish. So much of the material available to learners is riddled with analogies and a "you don't need to know that" attitude. That is counterproductive, and I think having accurate mental models matters.

> Knowing implies reasoning

That's not really clear-cut; it's simply a position you're taking. A JTB (justified-true-belief) account could, I reckon, say that a model's "knowledge" is justified by the training process and reward functions.

> LLMs don't "know" things. These statistical models continuate text.

I don't think it's clear to anyone at this point whether the steps taken before token selection (e.g. the path through the model's high-dimensional representation space shaped by attention) are close to or far from how our own thought processes work. But describing LLMs as "simply" continuating text reduces them to their outputs. From my perspective, as someone on the other side of a text-based web app from you, you too are an entity that simply continuates text.

You have no way of knowing whether this comment was written by a sentient entity -- with thoughts and agency -- or an LLM.

I have to disagree. We've been using "knowing" for programs for decades without requiring it to imply reasoning. Just because the output now looks more realistic doesn't mean we need to suddenly get philosophical about it. That shift says more about us than about the software.

And while accurate mental models can help in certain contexts, they're not always necessary. I don't need a detailed model of how my OS handles file operations to use it effectively. A high-level understanding is usually enough. Insisting on deep internal accuracy in every case seems more like gatekeeping than good practice.