
Comment by wat10000

2 days ago

Private data, untrusted data, communication: an LLM can safely have two of these, but never all three.

Browsing the web is both communication and untrusted data, so an LLM that can browse the web must never have access to any private data.

The problem is, so much of what people want from these things involves having all three.
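The two-of-three rule can be sketched as a capability gate. This is a hypothetical illustration (the names `Capability` and `is_safe` are made up for this sketch, not any real framework's API):

```python
from enum import Flag, auto

class Capability(Flag):
    PRIVATE_DATA = auto()     # access to user secrets, files, credentials
    UNTRUSTED_INPUT = auto()  # reads web pages, emails, third-party docs
    COMMUNICATION = auto()    # can send data out (HTTP, email, tool calls)

LETHAL = Capability.PRIVATE_DATA | Capability.UNTRUSTED_INPUT | Capability.COMMUNICATION

def is_safe(granted: Capability) -> bool:
    """A configuration is unsafe only when all three capabilities combine."""
    return granted & LETHAL != LETHAL

# A web-browsing agent (untrusted input + communication) stays on the
# safe side only while it is denied private data:
browser = Capability.UNTRUSTED_INPUT | Capability.COMMUNICATION
assert is_safe(browser)
assert not is_safe(browser | Capability.PRIVATE_DATA)
```

The point of the sketch is that safety here is a property of the whole capability set, not of any single permission, which is why bolting a filter onto one channel doesn't resolve the trifecta.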

> The problem is, so much of what people want from these things involves having all three.

Pretty much. Also there's no way of "securing" LLMs without destroying the quality that makes them interesting and useful in the first place.

I'm putting "securing" in scare quotes because IMO it's a fool's errand to even try - LLMs are fundamentally not securable like regular, narrow-purpose software, and should not be treated as such.

  • > I'm putting "securing" in scare quotes because IMO it's a fool's errand to even try - LLMs are fundamentally not securable like regular, narrow-purpose software, and should not be treated as such.

    Indeed. Between this fundamental unsecurability and the unsolved alignment problem, I struggle to see how OpenAI/Anthropic/etc. will manage to give their investors enough ROI to justify the investment.