Comment by jmward01

7 hours ago

It is an interesting take. I think this is mainly early-adoption pains though. This stuff is moving so fast that if you say 'it isn't useful because X isn't good enough', then just wait a month and X will be good enough, and you'll find Y is the blocker (or no blockers are left and it truly does become useful). Soon we will see this hooked into the home-assistant world, combined with local and remote compute, and then we are likely to see real movement.

Conventional LLMs are moving fast too. The argument is that OpenClaw isn't any more useful than conventional LLMs, and I suspect that will remain true because conventional LLMs will absorb any useful capabilities.

  • I think openclaw provides one unique feature: a standardized host environment for a persistent assistant. This is different from the chat interfaces presented by anthropic/openai/others, which give you a 'while you are here' assistant, and very different from the idea of trained LLM weights and ways of serving them up like llama.cpp and others. There really is something unique here that will evolve over time, I think.