Comment by Aloisius

4 days ago

> There is a long history of people arguing that intelligence is actually the ability to predict accurately.

That page describes a few recent CS people in AI arguing that intelligence is the ability to predict accurately, which is like carpenters declaring that every problem can be solved with a hammer.

AI "reasoning" is human-like in the sense that it is similar to how humans communicate reasoning, but that's not how humans mentally reason.

Like my father before me, I seem to have absorbed an ability to predict what comes next in movies and books. It's sometimes a fun parlor trick to annoy people who are genuinely surprised by these nearly deterministic plot twists. But, a bit like with LLMs, it is a superficial ability to follow the limited context that the writers' group is seemingly forced by contract to maintain.

Like my father before me, I've also gotten old enough to realize that some subset of people out there behave as if they were scripted by the same writers' group and production rules. I fear for a future where LLMs are on an equal footing because we choose to mimic them.