
Comment by cadamsdotcom

9 hours ago

There’s both efficacy and token efficiency to consider here.

It seems unlikely that an out-of-distribution language would be as effective as one that has all the training data in the world behind it.

It really needs an agent-oriented "getting started" guide to put in the context window, plus evals comparing the same task done in Python, Rust, etc.

> It really needs an agent-oriented "getting started" guide to put in the context window, plus evals comparing the same task done in Python, Rust, etc.

It has several such documents, including a ~1400-line MEMORY.md file that references several other such files, a language specification, a collection of ~100 documents containing just about every thought Jordan has ever had about the language and the evolution of its implementation, and a set of examples that includes an SDL2-based OpenGL program.

jkh clearly understands the need to bootstrap LLMs on his ~5-month-old, self-hosted solo programming language.