Comment by vmg12

2 days ago

It's pretty simple: don't give LLMs access to anything you can't afford to expose. You treat the LLM as if it were the user.
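
One way to read "treat the LLM as the user" is to execute every tool call the model makes under the end user's own permissions, never under a privileged service account. Here's a minimal sketch of that idea; the names (`UserContext`, `fetch_customer_record`, `handle_tool_call`, the `customers:read` scope) are hypothetical, not any particular product's API.

```python
# Hypothetical sketch: every tool call the model makes runs with the *end user's*
# permissions, so the model can never read data the invoking user couldn't.
from dataclasses import dataclass, field


@dataclass
class UserContext:
    user_id: str
    scopes: set = field(default_factory=set)  # permissions granted to the human user


def fetch_customer_record(customer_id: str, *, user_ctx: UserContext) -> dict:
    """Tool exposed to the LLM. The check uses the end user's scopes,
    not a privileged service account's."""
    if "customers:read" not in user_ctx.scopes:
        raise PermissionError("user lacks customers:read, so the model gets nothing")
    return {"id": customer_id, "name": "..."}  # placeholder lookup


TOOLS = {"fetch_customer_record": fetch_customer_record}


def handle_tool_call(name: str, args: dict, user_ctx: UserContext):
    # Route every model-initiated call through the same authorization path
    # a direct request from the user would take.
    return TOOLS[name](**args, user_ctx=user_ctx)


# Usage: the model asks for a record on behalf of a user who lacks the scope.
ctx = UserContext(user_id="u123", scopes={"pages:read"})
try:
    handle_tool_call("fetch_customer_record", {"customer_id": "c42"}, ctx)
except PermissionError as e:
    print(e)  # denied exactly as it would be for the user acting directly
```

The point of that design is that the model holds no standing credentials of its own, so it can only ever do what the user who invoked it could already do.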

> You treat the LLM as if it were the user.

That's not sufficient. If a user copies customer data into a public Google Sheet, I can reprimand and otherwise restrict that user. An LLM cannot be held accountable and cannot learn from its mistakes.

I get that, but it's just not entirely obvious how you do that for Notion AI.

  • Don't use AI/LLMs that have unfettered access to everything? (See the sketch after this list for one way to scope that access.)

    Feels like the question is "How do I prevent unauthenticated and anonymous users from using my endpoint that has no authentication and is on the public internet?", which is the wrong question.
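
For a workspace AI integration, "not unfettered" mostly means a deny-by-default allowlist of what the connector may read. A hedged sketch below, not Notion's actual API; `AI_ALLOWLIST`, the resource IDs, and `load_document` are made up for illustration.

```python
# Hypothetical sketch: deny-by-default allowlist for what an AI connector may read.
AI_ALLOWLIST = {
    "wiki/public-handbook",
    "wiki/engineering-runbooks",
}  # everything else (CRM exports, HR pages, customer data) stays invisible to the model


def ai_can_read(resource_id: str) -> bool:
    return resource_id in AI_ALLOWLIST


def fetch_for_model(resource_id: str) -> str:
    if not ai_can_read(resource_id):
        # The model never sees the resource's contents.
        raise PermissionError(f"AI connector is not allowed to read {resource_id}")
    return load_document(resource_id)


def load_document(resource_id: str) -> str:
    return f"contents of {resource_id}"  # stand-in for a real fetch
```

Combined with running the connector under the asking user's permissions (as in the earlier sketch), that is roughly what "don't give the LLM unfettered access" cashes out to in practice.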