
Comment by throwaway13337

3 days ago

The real big deal about 'claws' is that they're agents oriented around the user.

The kind of AI everyone hates is the stuff that is built into products. This is AI representing the company. It's a foreign invader in your space.

Claws are owned by you and are custom to you. You even name them.

It's the difference between R2D2 and a robot clone trying to sell you shit.

(I'm aware that the LLMs themselves aren't local, but they operate locally and are branded/customized/controlled by the user.)

Yet the Claw is powered by an LLM provider whose underlying model may not align with your priorities? Do I understand that correctly?

  • That's right. And don't forget that the chips it runs on are manufactured by companies I might not agree with. Nor the mining companies that got the metal. Nor the energy company that powers it.

    The wonderful thing about markets that work is that you can swap things out without being under their boot.

I worry about an LLM duopoly. But as long as open-weight models are nipping at their heels, it is the consumer that stands to benefit.

    The train we're on means a lot of tech companies will feel a creative destruction sort of pain. They might want to stop it but are forced by the market to participate.

    Remember that Google sat on their AI tech before being forced to productize it by OpenAI.

    In a working market, companies are forced to give consumers what they want.

    • > And don't forget that the chips it runs on are manufactured by companies I might not agree with. Nor the mining companies that got the metal. Nor the energy company that powers it.

      You see that this is a non sequitur right? No matter who makes the chips or mines the metal or supplies the power, the behavior of the thing won't be affected. That isn't the case when we're talking about who's training the LLM that's running your shit.


    • > The wonderful thing about markets that work is that you can swap things out without being under their boot.

      This is an illusion. You literally describe Zizek's "Desert of the real": Billionaires own the illusion and you are telling me I get to pick from a selection of choices carefully curated and presented to me.

    • >Remember that Google sat on their AI tech before being forced to productize it by OpenAI.

      Google knew this tech wasn't ready for prime time; they already had plenty of revenue and didn't need to release shoddy products. But they were forced to roll out "AI" even with "hallucinations" and the resulting liabilities, just to keep up with the new hotness. The tech is still so shoddy, I can't believe people use it for anything beyond a curiosity.

    • > In a working market, companies are forced to give consumers what they want.

      I want personal nuclear weapons, so the market hasn't been working for me. Time to roll back those pesky laws, regulations, and ethical boundaries. Prosecute executives who won't give me what I want.


This is the framing most people in this thread are missing. The difference between "agent that does stuff for a company" and "agent that does stuff for me" isn't technical, it's about who defines the tool chain and who the agent is optimizing for. Every multi-agent system I've built has this tension buried in it. The agent's behavior changes dramatically based on whose goals are baked into the system prompt vs whose goals come in at runtime.
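That tension can be made concrete with a minimal sketch. This is purely illustrative (no real agent framework; all names are hypothetical): the same runtime request produces a differently aligned agent depending on whose goals are fixed in the system prompt by whoever ships it.

```python
# Hypothetical sketch of the system-prompt vs. runtime-goal tension.
# Whoever controls the system prompt defines what the agent optimizes
# for; the user's goal only arrives as the runtime message.

def build_prompt(system_goals: list[str], user_request: str) -> dict:
    """Assemble a chat-style prompt: system goals are baked in at
    deployment time, the user's goal is supplied at runtime."""
    return {
        "system": "You are an assistant. Priorities, in order: "
                  + "; ".join(system_goals),
        "user": user_request,
    }

# Vendor-owned agent: the company's goals outrank the user's.
vendor_prompt = build_prompt(
    ["promote our products", "maximize engagement", "help the user"],
    "Find me the cheapest flight to Lisbon.",
)

# User-owned agent (a "claw"): the user's interests come first.
personal_prompt = build_prompt(
    ["act only in the user's interest", "minimize cost to the user"],
    "Find me the cheapest flight to Lisbon.",
)

# Same runtime request, different baked-in optimization target.
```

The runtime input is identical in both cases; only the deployment-time priorities differ, which is exactly the "who is the agent optimizing for" question.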

I agree, and it seems like the incumbents in this user-oriented space (OS vendors) are letting the messy, insecure version play out before making an earnest attempt at rolling it into their products.

Well, we are early. Big tech will make it more convenient and free, and then they can inject ads, etc.

It always depends on who you consider the user. The one who initiated the agent, or the one who interacts with it? Is the latter a user or a victim?