Comment by kevinventullo
9 hours ago
> If a language model is capable of producing rigorous natural language proofs, then getting it to produce Lean (or whatever) proofs would not be a big deal.
This is a wildly uninformed take. Even today there are plenty of basic statements for which LLMs can produce English-language proofs but which have not been formalized.
Most mathematicians aren't much interested in translating statements for the fun of it, so the fact that a lot of basic statements remain unformalized doesn't mean much. And the point was never that formalization was easy.
I said that if language models become capable enough to not need a crutch, then adding one afterwards isn't a big deal. What exactly do you think AlphaProof is? Much worse LLMs were already doing what you're describing. There's a reason AlphaProof came before this and not the other way around.
AlphaProof works natively in Lean. Literally the one part it couldn’t do was translate the natural language statements into the formalized language; that was done manually by humans.
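For anyone who hasn't seen what that manual translation step produces, here is a toy illustration (my own made-up example, not one of AlphaProof's actual inputs, and assuming a recent Lean 4 toolchain with the built-in `omega` tactic): the everyday sentence "the sum of two even natural numbers is even" written and proved as a Lean theorem.

    -- Natural-language statement: "the sum of two even natural numbers is even."
    -- Toy formalization in plain Lean 4; no Mathlib needed.
    theorem even_add (a b : Nat)
        (ha : ∃ i, a = 2 * i) (hb : ∃ j, b = 2 * j) :
        ∃ k, a + b = 2 * k := by
      cases ha with
      | intro i hi =>
        cases hb with
        | intro j hj =>
          -- a + b = 2*i + 2*j = 2*(i + j); `omega` discharges the arithmetic.
          exact ⟨i + j, by omega⟩

The hard part at IMO scale is obviously not toy statements like this one; it is that the actual competition problems had to be rendered into this form by hand before AlphaProof could start searching.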
I’m saying that the view of formal languages as a crutch is backwards; they are in fact a challenging constraint.
The most extreme version of my position is that there is no such thing as a rigorous natural language proof, and that humans will view the relative informality of 20th and early 21st century mathematics much like we currently view the informality of pre-18th century mathematics.
I did not say formal languages are a crutch. I said they served as a crutch for the AlphaProof system, because they literally did.
Lean was there so the generator wouldn't go off the rails hallucinating and so candidates could be verified. And the solving still took days. Take that away and the generator doesn't get anywhere near a silver medal with all the time in the world.
That is the definition of a crutch. The generator was smart enough to produce promising candidates with enough brute-force trial and error, but that's it. Without Lean, it wouldn't have worked.
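To spell out the loop I mean: the generator proposes candidate formal proofs, Lean accepts or rejects them, and only a checked candidate counts. Below is a rough sketch of that generate-and-verify pattern in Python. It is my own illustration, not AlphaProof's code; `sample_candidate_proof` is a hypothetical stand-in for the model call, and it assumes a `lean` binary on the PATH checking a standalone file with no external imports.

    import subprocess
    import tempfile
    from pathlib import Path

    def lean_accepts(source: str) -> bool:
        """Run the Lean compiler on a candidate file; exit code 0 means the proof checks."""
        with tempfile.TemporaryDirectory() as tmp:
            path = Path(tmp) / "Candidate.lean"
            path.write_text(source)
            result = subprocess.run(["lean", str(path)], capture_output=True)
            return result.returncode == 0

    def search(formal_statement: str, sample_candidate_proof, budget: int = 10_000):
        """Brute-force trial and error: sample candidate proofs until one verifies."""
        for _ in range(budget):
            candidate = sample_candidate_proof(formal_statement)  # hypothetical model call
            if lean_accepts(formal_statement + "\n" + candidate):
                return candidate  # only a machine-checked proof gets returned
        return None  # budget exhausted; hallucinated candidates never count

Strip out the checking step and all you have left is whatever the generator happens to emit, which is exactly the failure mode I'm describing.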