
Comment by GTP

2 years ago

It's true that there are open-source models one can run locally, but the question is how many people will actually do that. You can make the instructions in a GitHub README as clear and straightforward as you like, but for the majority of people, nothing will beat the convenience of some big corporation's web application. For many, the very fact that a product is made by a famous company is a reason to trust it more.

This gets missed in a lot of conversations about privacy (because most conversations about privacy happen among pretty technical people). The vast majority of people have no idea what it means to set up their own local model, and of those who do, fewer still can or will actually do it.

Saying that open-source models exist, so AI privacy is not an issue, is like saying Google isn't a privacy problem because self-hosted email exists.

  • Private LLMs are really not more complicated than installing an app (see the sketch at the end of this thread). But I expect all web browsers and operating systems will sport a local model in the near future, so it will be available out of the box. As for adoption, it's the easiest interface ever invented.

    • > But I expect all web browsers and operating systems will sport a local model in the near future

      Yes, and maybe with some kind of "telemetry" to help the developers or other users ;)

    • > Private LLMs are really not more complicated than installing an app.

      Most people install apps through an app store.

    • > it's the easiest interface ever invented.

      I'm not sure about this at all. It looks like a clumsy, difficult interface to me. The easiest interface wouldn't require multiple rounds of back-and-forth to get the results you're seeking.

    • You need a well-performing, always-on PC, at least. Preferably it needs to be securely accessible from a mobile device. Less than 1 percent of all people have that.
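
For reference, here is roughly what "just installing an app" looks like today with a local model. This is a minimal sketch using the llama-cpp-python bindings; the model file name, context size, and prompt are illustrative, and you would first have to download a quantized GGUF model (typically a few GB) yourself.

```python
# Minimal local-inference sketch with llama-cpp-python.
# Assumes a quantized GGUF model has already been downloaded to ./models/;
# the file name below is a placeholder, not a real release artifact.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,  # context window; larger values need more RAM
)

# Both the prompt and the completion stay on the local machine.
result = llm(
    "Q: Why does running a model locally help with privacy? A:",
    max_tokens=64,
    stop=["Q:"],
)
print(result["choices"][0]["text"].strip())
```

Whether that counts as "not more complicated than installing an app" is exactly the disagreement in this thread: it is only a few lines, but it still assumes a capable machine and the willingness to download and manage model files.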