Comment by echoangle

11 days ago

As others have already said, think about what you're doing when you use this.

If you connect a non-self-hosted LLM to this, you're effectively uploading chat messages with other people to a third-party server. The people you chat with have an expectation of privacy, so this would probably be illegal in many jurisdictions.

Except basically all of Europe is one-party consent, and things like tech support call centres have been doing variants of this for years.

  • One-party consent only means you can legally record something; it doesn't necessarily mean that you're allowed to share it with (non-government) third parties later.

    It could be legal to record and use as evidence in court later, but that doesn't mean you're allowed to share it with some AI company.

    • Their TOS covers utilisation of the data under 'Quality and Training purposes', with consent implied by engagement with the service in question; the breadth and application of that clause has never had a test case, to my knowledge.

Your information is gone the moment you utter words. I can also copy and paste the messages people send me.

  • > I can also copy and paste the messages people send me.

    Sure you can, but people can sue you if you paste it somewhere public. I don't know if you're making some deep philosophical point, but this is something people have been sued over, and lost, before.

I would argue that there is no expectation of privacy for messaging apps without end-to-end encryption. There is always a man in the middle listening.

  • Legally, there absolutely is. By law, the messaging app operator also can't just publish the stuff you write in a chat. Even a disclaimer in the terms of service probably wouldn't hold up if people generally assume the chat to be private.

    And it also doesn't even matter because WhatsApp claims to be E2E-encrypted.

  • Meta claims WhatsApp is end-to-end encrypted.

    It's up to you to trust Meta or not, but people who trust them do have an expectation of privacy.

  • That's irrelevant here, because the OP is running the LLM on one of the ends, so it's decrypted the same as when you're reading the chat conversation yourself.

    It also misses the mark because you're talking about an eavesdropper intercepting messages, while the OP is the receiver sharing the messages with a third party themselves (see the sketch below).
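
    To make that concrete, here's a minimal sketch (Python, with a hypothetical endpoint and made-up names, not any real client's code) showing why E2E encryption can't help once the recipient's endpoint forwards the plaintext:

        import json
        import urllib.request

        def on_message_received(decrypted_text: str) -> None:
            # E2E encryption has already done its job by this point: the
            # ciphertext was opaque in transit and to the server operator.
            # But the endpoint now holds plaintext, as it must in order to
            # display the message at all.
            print(f"Chat: {decrypted_text}")

            # Nothing technical stops the same endpoint from uploading that
            # plaintext anywhere, e.g. to a hosted LLM. The URL and payload
            # shape are placeholders, not any real provider's API.
            payload = json.dumps({"prompt": decrypted_text}).encode()
            req = urllib.request.Request(
                "https://llm.example.com/v1/complete",  # hypothetical
                data=payload,
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req)  # plaintext leaves the E2E boundary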

> The people you chat with have an expectation of privacy so this would probably be illegal in many jurisdictions.

Name one

  • Germany.

    You have an "allgemeines Persönlichkeitsrecht" (a general right of personality) that prevents other people from publishing information that's supposed to be private.

    Here's a case where someone published a Facebook DM, for example:

    https://openjur.de/u/636287.html

    • How would this stand up to the "I didn't do it, I probably got hacked!" defense? It's one thing to publish a personal conversation, and another to have your conversations aggregated by some LLM (and if they leak plaintext, the "hacked" defense is even more plausible).

    • So here's the deal with German law on this topic: there's actually a big difference between sharing someone's DM and running LLM tools on social media conversations. The OLG Hamburg case from 2013 (case number 7 W 5/13) establishes that publishing private messages without permission violates personality rights ("allgemeines Persönlichkeitsrecht").

      While we don't have specific LLM court rulings yet, German data protection authorities have been addressing AI technologies under GDPR principles. The Bavarian Data Protection Authority (BayLDA) and the Hamburg Commissioner for Data Protection have both issued opinions that automated AI processing of personal communications requires an explicit legal basis under Article 6 GDPR, unlike simple sharing, which falls under personality rights law. The German Federal Commissioner for Data Protection (BfDI) has indicated that LLM processing would likely be evaluated on purpose limitation, data minimization, and transparency requirements.

      In practice, this means LLM tools could legally process conversations if they implement proper anonymization, provide clear user notices, and observe purpose limitations, conditions not required for the simpler act of sharing a message. German courts distinguish between publishing content (governed by personality rights) and processing data (governed by data protection law), creating different standards for each activity.

      While the BGH (Federal Court of Justice) hasn't ruled specifically on LLMs, its decisions on automated data processing suggest it would likely allow such processing with appropriate safeguards, whereas unauthorized DM sharing remains almost always prohibited under personality rights jurisprudence, regardless of technical implementation.

    • That case describes publishing a message to the public internet. I don't believe the same would apply when using a tool like this.

      My family members all back up our conversations to Google Drive; I doubt WhatsApp would offer that feature if it were illegal.
