Comment by ai-christianson
6 days ago
For anyone looking at alternatives in this space: I built Gobii (https://gobii.ai) 8 months before OpenClaw existed. MIT-licensed, cloud-native, gVisor-sandboxed.
The sandboxing part matters more than people think. Giving an LLM a browser with full network access and no isolation is a real security problem that most projects in this space hand-wave away.
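For concreteness, here is roughly what gVisor-style isolation looks like at the container level. This is a generic sketch, not Gobii's actual deployment config; the image name is made up, and it assumes gVisor's `runsc` runtime has been installed and registered with Docker.

```shell
# Run the agent's browser container under gVisor's user-space kernel
# (--runtime=runsc) instead of directly on the host kernel, and cut off
# network access entirely (--network=none) as a worst-case default.
docker run --runtime=runsc --network=none my-agent-browser
```

In practice you would attach a restricted network rather than none, but the point stands: syscalls from the sandboxed browser hit gVisor's user-space kernel, not the host.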
Multi-provider LLM support (OpenAI, Anthropic, DeepSeek, open-weight models via vLLM). In production with paying customers.
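A common way to get multi-provider support is to route everything through OpenAI-compatible endpoints and swap the base URL per provider. A minimal sketch of that pattern (provider names, URLs, and model IDs are illustrative, not Gobii's internals):

```python
# Illustrative provider registry: each entry points an OpenAI-compatible
# client at a different backend, including a self-hosted vLLM server.
PROVIDERS = {
    "openai":   {"base_url": "https://api.openai.com/v1", "model": "gpt-4o"},
    "deepseek": {"base_url": "https://api.deepseek.com",  "model": "deepseek-chat"},
    "vllm":     {"base_url": "http://localhost:8000/v1",  "model": "local-model"},
}

def client_config(provider: str) -> dict:
    """Return connection settings for the chosen provider."""
    if provider not in PROVIDERS:
        raise ValueError(f"unknown provider: {provider}")
    return PROVIDERS[provider]

# Switching providers is just a config lookup; the calling code is unchanged.
print(client_config("vllm")["base_url"])
```

Anthropic's native API is not OpenAI-compatible, so a real router would also need a per-provider adapter layer, but the base-URL swap covers the OpenAI-compatible majority.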
Happy to answer architecture questions.
Looks good! I'm curious, are customers fine with their data going to third-party LLM providers?
I think this ship has sailed pretty hard by now. Pretty much any app you can possibly use, from iTerm to Slack, is sending data to third-party LLMs (sometimes explicitly, more often as small features here and there).
Control of where data goes is always an option. People just need to make that choice.
Not sure what gives you that idea. One of our superpowers is that we're MIT-licensed and deployable to private clouds, or even fully air-gapped with 196GB+ of VRAM to run MiniMax on vLLM + Gobii.
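The air-gapped setup described above boils down to serving an open-weight model locally and pointing the agent at it. A rough sketch, assuming a multi-GPU host; the model ID and flag values are assumptions, not a tested Gobii configuration:

```shell
# Serve an open-weight model behind vLLM's OpenAI-compatible API on the
# private network; --tensor-parallel-size shards the weights across GPUs.
vllm serve MiniMaxAI/MiniMax-M1 \
    --tensor-parallel-size 8 \
    --port 8000
```

The agent then talks to `http://<host>:8000/v1` exactly as it would to a hosted provider, so no prompts or page contents ever leave the network.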