Comment by jibal

2 days ago

That's the fallacy of denying the antecedent. You are inferring, from the fact that airplanes really fly, that AIs really think, but that is not a logically valid inference.

Observing a common (potential) failure mode is not equivalent to asserting a logical inference. It is only a fallacy if you argue "P, therefore C", which GP is not (at least to my eye) doing.