
Comment by acidburnNSA

4 hours ago

* "Self-hosted: Runs entirely on your infrastructure. No data leaves your network."

* "Bring Your Own LLM: Anthropic, OpenAI, Gemini, or open-weight models via vLLM."

With so many newbies wanting these kinds of services, it might be worth adjusting the first bullet to say: "No data leaves your network, at least as long as you don't use any Anthropic, OpenAI, or Gemini models via the network, of course."

That's a good point, it might make sense to clarify that for individuals who want to self-host. I'll make the change, thanks!

Most organizations are going to be self-hosting on AWS, GCP, or Azure... so as long as you use their inference services as your LLM, you can keep it all within the private network.

  • Even self-hosting on AWS, GCP, or Azure isn't local enough for certain applications, such as export-controlled work where any sysadmin or person with physical access to the server/data is required to be a US Person (or the equivalent in other countries). This is the niche that the govcloud solutions are aimed at serving. But some people just want to build big, actually-private, actually self-hosted systems and do their own physical and network security.

  • Exactly, enterprise customers almost always use private model endpoints on their cloud provider for any serious deployments. Data stays within the customer's VPC, and data security and privacy are guaranteed by the cloud providers.
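The "keep it all within the private network" setup described above can be sketched concretely: vLLM exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so an in-network client only needs a plain HTTP POST. This is a minimal stdlib-only sketch; the hostname `llm.internal`, port, and model name are illustrative assumptions, not part of any product discussed in the thread.

```python
import json
import urllib.request

# Assumed internal address of a self-hosted vLLM server; no traffic
# leaves the private network because the hostname resolves internally.
VLLM_URL = "http://llm.internal:8000/v1/chat/completions"

def build_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for a vLLM endpoint."""
    payload = {
        "model": "meta-llama/Llama-3.1-8B-Instruct",  # assumed model name
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        VLLM_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Inside the VPC you would send it with urllib.request.urlopen(req);
# omitted here since it requires the internal server to be reachable.
req = build_request("Summarize our deployment options.")
```

The same request shape works whether the backend is vLLM on your own hardware or a cloud provider's private model endpoint, which is why switching between the two rarely requires client changes.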