Comment by boredhedgehog

7 hours ago

> Then when it fails to apply the "reasoning", that's evidence the artificial expertise we humans perceived or inferred is actually some kind of illusion.

That doesn't follow if the weakness of the model manifests on a level we wouldn't call rational in a human.

For example, a human might have dyslexia, a disorder on the perceptual level. A dyslexic person can understand and explain his own limitation, but that understanding doesn't help him overcome it.

Typically, when humans have a disorder or limitation, they adapt to it by developing coping strategies or by using tools and environmental changes to compensate. Maybe people expect a true reasoning model to be able to do the same thing?