Comment by throwa356262

"LLM backends: Anthropic, OpenAI, OpenRouter."

And here I was hoping that this was local inference :)

Sure. Why purchase an H200 if you can go with an ESP32 ^^

  • Blowing more than 800 kB on essentially an HTTP API wrapper is actually kinda bad. The original Doom binary was 700 kB and had vastly more complexity. This is in C after all, so by stripping out nonessential stuff and using the right compiler options, I'd expect something like this to come in under 100 kB.

    • Doom had the benefit of an OS that included a lot of low-level bits like a net stack. This doesn’t! That 800kB includes everything it would need from an OS too.

    • > vastly more complexity.

      Doom is ingenious, but it is not terribly complex IMHO, not compared to a modern networking stack including a WiFi driver. The Doom renderer's charm is in its overall simplicity. The AI is effective but not sophisticated.

    • The ESP32 libraries as a whole are kind of bloated. To enable Bluetooth, WiFi, or HTTP handling, you need to embed some large libraries.

    • yeah I sandbagged the size just a little to start (small enough to fit on the C3, 888 picked for good luck & prosperity; I even have a build that pads to get 888 exactly), so I can now try to reduce some of it as an exercise etc.

      but 100 kB you're not gonna see :) this has WiFi, TLS, etc. Doom didn't need those.
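      To make the "this has WiFi, TLS" point concrete, here is a hedged sketch of roughly the minimum an ESP-IDF app in C does before it can open a single socket (standard esp_netif/esp_wifi calls; error handling and the credentials/config this project actually uses are omitted and assumed):

          #include "nvs_flash.h"   // Wi-Fi driver stores calibration data in NVS
          #include "esp_netif.h"   // lwIP TCP/IP stack glue
          #include "esp_event.h"   // default event loop for Wi-Fi/IP events
          #include "esp_wifi.h"    // Wi-Fi driver + WPA supplicant

          // Hypothetical minimal bring-up; a real client still needs
          // esp_wifi_set_config() with credentials, esp_wifi_connect(),
          // and esp-tls/mbedTLS on top of this for HTTPS.
          void wifi_bring_up_minimal(void)
          {
              nvs_flash_init();
              esp_netif_init();
              esp_event_loop_create_default();
              esp_netif_create_default_wifi_sta();

              wifi_init_config_t cfg = WIFI_INIT_CONFIG_DEFAULT();
              esp_wifi_init(&cfg);               // pulls in the Wi-Fi driver blob
              esp_wifi_set_mode(WIFI_MODE_STA);
              esp_wifi_start();
          }

      Each of those init calls drags in a sizable chunk of driver, lwIP, and supplicant code, which is why the 100 kB figure from the parent comment looks optimistic for anything that speaks TLS over Wi-Fi.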

haha well I got something ridiculous coming soon for zclaw that will kinda work on board.. will require the S3 variant tho, needs a little more memory. Training it later today.

  • Sounds interesting, please keep us posted.

    I don't think I have an S3, but I have plenty of C3s. I thought they had the same amount of memory.

right, 888 kB would be impossible for local inference

however, it is really not that impressive for just a client

  • It's not completely impossible, depending on what your expectations are. That language model that was built out of redstone in Minecraft had... looks like 5 million parameters. And it could do mostly coherent sentences.

    •   > built out of redstone in Minecraft

      Ummm...

        > 5 million parameters

      Which is a lot more than 888 kB... Supposing your ESP32 could use qint8 (LOL), that's still 1 byte per parameter, and the k in kB stands for thousand, not million.

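      For a back-of-the-envelope check on that (the 5-million-parameter figure comes from the comment above and the 1-byte-per-weight quantization from this reply; both are assumptions, not measurements):

          #include <stdio.h>

          int main(void)
          {
              const long long params      = 5000000;    // claimed redstone model size
              const long long bytes_per_w = 1;          // optimistic 8-bit weights
              const long long image_bytes = 888 * 1000; // the 888 kB firmware image

              long long model_bytes = params * bytes_per_w;
              printf("model: %lld kB vs image: %lld kB (%.1fx)\n",
                     model_bytes / 1000, image_bytes / 1000,
                     (double)model_bytes / (double)image_bytes);
              // prints: model: 5000 kB vs image: 888 kB (5.6x)
              return 0;
          }

      So even with 8-bit weights, the model alone would be roughly five to six times the entire firmware image, before any code.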

  • I disagree; in the future it might be possible, though perhaps not in English but in some more formal (yet fuzzy) language with some basic epistemology.

    I mean, there is a lambda calculus self-interpreter in 29 bytes. How many additional logical rules are required for AGI-level inference? Maybe not as many as people think. Understanding about 1000 concepts of Basic English (or, say, Lojban) might well be sufficient. It is possible this can be encoded in 800 kB; we just don't know how.