Comment by chrisischris

8 days ago

Been building Spore, a distributed AI inference platform where you run models on your own hardware and access them from anywhere. The idea is that you keep control of your data while participating in a credit economy that lets you access more powerful models when needed.

Major updates over the last 6 weeks: Built a cross-platform CLI for running your own inference node. Handles auth, model management, and serving through a single command.

Shipped an OpenAI-compatible API with streaming and tool calling. Added web search to the chat interface.
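Since the API follows the OpenAI chat-completions schema, existing clients should only need a different base URL. A minimal sketch of what a streaming request with a tool definition looks like — the endpoint, model ID, and `get_weather` tool here are all made up for illustration, not Spore's actual names:

```python
import json

# Hypothetical local node endpoint; the real URL will differ.
BASE_URL = "http://localhost:8080/v1/chat/completions"

# Standard OpenAI-style body: stream=True for incremental tokens,
# plus one (invented) tool the model may choose to call.
request_body = {
    "model": "llama-3.1-8b",          # placeholder model ID
    "stream": True,
    "messages": [
        {"role": "user", "content": "What's the weather in Berlin?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

payload = json.dumps(request_body)
```

Because the shape matches OpenAI's, any SDK that lets you override the base URL can in principle talk to a node directly.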

Rewrote the job queue to guarantee FIFO ordering per model+context and fixed a race condition causing out-of-order assignments. Added circuit breakers for connection stability.
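The per-key FIFO guarantee can be sketched as one queue per (model, context) key, with at most one in-flight job per key so two workers can never pick up jobs for the same key out of order. This is a minimal illustration of the idea, not Spore's actual queue code; all names are assumptions:

```python
from collections import defaultdict, deque
import threading

class PerKeyFIFOQueue:
    """Dispatch jobs FIFO within each (model, context) key,
    while different keys can still run in parallel."""

    def __init__(self):
        self._lock = threading.Lock()
        self._queues = defaultdict(deque)  # (model, context) -> pending jobs
        self._busy = set()                 # keys with a job in flight

    def submit(self, model, context, job):
        with self._lock:
            self._queues[(model, context)].append(job)

    def next_job(self):
        # Skip keys that already have a job in flight; handing out a
        # second job for the same key is exactly the out-of-order race.
        with self._lock:
            for key, q in self._queues.items():
                if q and key not in self._busy:
                    self._busy.add(key)
                    return key, q.popleft()
            return None

    def done(self, key):
        # Worker acknowledges completion, releasing the key for its
        # next queued job.
        with self._lock:
            self._busy.discard(key)
```

Serializing per key keeps ordering where it matters (a single model+context) without forcing the whole cluster through one global queue.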

Wrapping up alpha testing in the next couple of weeks, then inviting people from the email list into the beta: https://sporeintel.com