Comment by biddit
18 hours ago
I have a bespoke local agent that I built over the last year, similar in capabilities to Moltbot, but with more deterministic code.
Running this kind of agent in the cloud certainly has upsides, but also:
- All home/local integrations are gone.
- Data needs to be stored in the cloud.
No thanks.
There's a hidden trade-off here: Latency vs Privacy
A local agent has zero ping to your smart home and files, but high latency to the outside world (especially with bad upload speeds). A cloud agent (e.g. one running on Cloudflare) has a fat pipe to APIs (OpenAI/Anthropic) and the web, but can't see your local printer.
The ideal future architecture is hybrid: a dumb local executor running commands from a smart cloud brain via a secure tunnel (like Cloudflare Tunnel), as in the sketch below. Running the agent's brain locally is a bottleneck unless you can run a capable model like Llama 3 on your own hardware.
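To make that concrete, here's a minimal sketch of what the "dumb local executor" half could look like, assuming a Cloudflare Tunnel (or similar) forwards the cloud brain's requests to a localhost port. The endpoint, bearer token, and command whitelist are all hypothetical placeholders, not taken from Moltbot or any real agent:

```python
# Minimal sketch of a dumb local executor behind a tunnel.
# The cloud brain POSTs a command name; only whitelisted commands run.
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical whitelist: the only things the cloud brain may ask for.
ALLOWED_COMMANDS = {
    "list_downloads": ["ls", "-lh", "/home/me/Downloads"],
    "printer_status": ["lpstat", "-p"],
}

AUTH_TOKEN = "replace-with-a-long-random-secret"  # shared with the cloud brain


class ExecutorHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.headers.get("Authorization") != f"Bearer {AUTH_TOKEN}":
            self.send_error(401, "bad token")
            return
        length = int(self.headers.get("Content-Length", 0))
        try:
            request = json.loads(self.rfile.read(length))
            command = ALLOWED_COMMANDS[request["command"]]
        except (json.JSONDecodeError, KeyError):
            self.send_error(400, "unknown or malformed command")
            return
        # Run the whitelisted command locally; nothing leaves the machine
        # except this small text result sent back to the brain.
        result = subprocess.run(command, capture_output=True, text=True, timeout=30)
        body = json.dumps(
            {"stdout": result.stdout, "returncode": result.returncode}
        ).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Bind to localhost only; the tunnel is the sole way in from outside.
    HTTPServer(("127.0.0.1", 8123), ExecutorHandler).serve_forever()
```

The point of the split is that the smart part (the LLM) lives where the bandwidth is, while the part that can touch your files and printer stays dumb, auditable, and behind an explicit whitelist.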
This is ultimately the first question I have whenever someone tells me about a shiny new AI toy... "Where does my data go?" Because if it doesn't stay on my machine, hard pass.
What kind of hardware do you need, and how does it compare to the cloud agents?