Comment by throwaway17_17

1 month ago

I know this reply is late, but what the hell.

Your comment is certainly correct, and I agree that the various LLM implementations probably cannot actually partition their proof-search attempts into any given logical system.

My comment was more tongue-in-cheek than your response took it to be; I obviously phrased it awkwardly. I was jokingly hoping that, by unintentionally exploiting some previously unknown foundational rule of computer science, LLMs as 'thinking' algorithms simply could not understand or use non-constructive logic to formulate a proof.

As I said, I did not think this was actually what's going on with GPT failing to actually, or convincingly, 'understand' the law of the excluded middle. It was more of a backhanded insult at LLMs in particular, and at the salespeople who want to talk about them as thinking, reasoning, semi-conscious algorithmic 'beings'.
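For anyone following along who hasn't seen the constructive/classical split made concrete: in a proof assistant like Lean, the core logic is constructive, and the law of the excluded middle isn't derivable there. It has to be pulled in from the `Classical` namespace, where it rests on axioms. A minimal sketch (Lean 4):

```lean
-- Excluded middle is not provable constructively; Lean provides it
-- via `Classical.em`, which depends on the axiom of choice.
example (p : Prop) : p ∨ ¬p := Classical.em p

-- A typical non-constructive move: double-negation elimination.
-- The proof tells us `p` holds without ever constructing evidence
-- for which disjunct of `p ∨ ¬p` was true.
example (p : Prop) (h : ¬¬p) : p :=
  (Classical.em p).elim id (fun hnp => absurd hnp h)
```

Which is exactly why the joke doesn't land as a real constraint: nothing stops a text predictor from emitting `Classical.em` any more than it stops a human from invoking it.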