Comment by CSSer

2 days ago

I used to be less cynical, but I could see them not honoring that, legal or not. The real answer, regardless of how you feel about that conversation, is that Claude Code, not any model, is the product.

I couldn't. Aside from violating laws in various countries and opening them up to lawsuits, it would be extremely bad for their enterprise business if they were caught stealing user data.

  • Maybe. But the data is there: imagine financial trouble, someone buys in and uses the data for whatever they want, much like 23andMe. If you want something to stay a secret, you don't send it to that LLM, or you use a zero-retention contract.

    • If your threat model is "vendor willing to ignore contracts and laws to steal your data" I can't see how a zero-retention contract helps.

  • They don't need to use your data in an external-facing product to get utility from it. Their ToS explicitly states that they don't train generative models on user data. That does not cover reward models, judges, or other internal tooling that otherwise allows them to improve.

  • You don't have to imagine; you can see it happening all the time. Even huge corps like FB have already been fined for ignoring user-consent laws around data tracking, and thousands of smaller ones are obviously ignoring explicit opt-in requirements, under the GDPR at least.