Comment by benreesman
3 days ago
I'm already doing almost all my LLM-assisted programming in Haskell, all the devops in check-heavy Nix, and starting to move everything into Dhall.
There's no way that Python written by LLMs will survive contact with Haskell written by LLMs in a standup competition.
One imagines the scope for this sort of thing goes very high in formality.
> There's no way that Python written by LLMs will survive contact with Haskell written by LLMs in a standup competition.
That's likely true for formal correctness.
But what about the disparity between Python's large and mature ecosystem and the skimpiness of Haskell's libraries? How is an LLM going to compensate for Haskell's lack of math, AI/ML, graphics, web, and other key infrastructure?
An LLM writing Haskell must produce new code wherever production-grade libraries don't exist, so the Haskell app will be bigger than the Python one. It will also be relatively unverified compared to an LLM-written Python app, simply because all that missing library code is brand new and hasn't proven itself in the real world.
And what about the disparity in LLM training data across application domains? Given how little Haskell training data exists, an LLM will be more likely to hallucinate and will require more human iterations before producing a production-ready app.
> One imagines the scope for this sort of thing goes very high in formality.
Haskell's type system is great, but only as far as it goes.
I've found Hackage to just completely shit on PyPI in terms of:
- there's a good library
- it's relatively clear which 2 or 3 represent standard choices
- it's going to be well-tested, including performance
The Haskell library ecosystem is a lot like the Python one with the bottom 80-90% lowest-quality stuff sliced off. There are one or two things and they work. I needed an HTTP/2 client the other day; which is a good one? That's a long pull in any language. Haskell has it, of course.
You're right on the data science stuff, but that's true everywhere: Julia and R get killed on libraries in data science too. Lattner is working on that problem and he hasn't nailed it yet; it's a "hard even for Chris Lattner" problem.
One thing that might be throwing you off (it misled me at first) is that when a Haskell library has had very little activity for a few years, that usually means it's done: it changes when the world of foobar changes, and only when the world of foobar changes.
There are exceptions to this (I wish the effects and optics libraries had more consensus), but mostly there's a right one and it works great.
Yeah, it makes me sad that the skill of LLMs at a language is directly proportional to the popularity of a language, which is itself inversely proportional to the formal guardrails placed on the language by a type system.
Serious question: Why not have it generate assembly?
Because that is a movement in the opposite direction from the one yielding the improved results. Putting an LLM on formal-system guardrails is the best of both worlds: high-temperature stochastic generation gated by theory-laden checks.
Having a high-temperature model operate with no safety net beyond a segfault sounds like the worst of both worlds.
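A minimal sketch of what "theory-laden checks" buys you (the names here are hypothetical, not from any real codebase): giving semantically distinct integers distinct newtypes means an argument swap that a high-temperature model might plausibly emit becomes a compile error instead of a silent runtime bug.

```haskell
-- Hypothetical sketch: two Ints with different meanings get distinct
-- newtypes, so the type checker gates what the generator is allowed
-- to say. Newtypes compile away, so this costs nothing at runtime.
newtype UserId  = UserId Int  deriving (Show, Eq)
newtype OrderId = OrderId Int deriving (Show, Eq)

describeOrder :: UserId -> OrderId -> String
describeOrder (UserId u) (OrderId o) =
  "user " ++ show u ++ " placed order " ++ show o

-- describeOrder (OrderId 7) (UserId 1) is rejected at compile time;
-- the equivalent slip in untyped generated code fails only at runtime,
-- if at all.
```

The point isn't that this catches everything, only that each check of this kind is one more class of hallucination the sampler can't get past the gate.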