Comment by jumploops

9 days ago

One thing I’m personally excited about is the democratization of software via LLMs.

Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points the unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.

Getting a user to install a local DB and a service to run their app (god forbid, updating said service) is complex even for developers (hence the prevalence of containers).

It will take some time (i.e. pre-training runs), but this is a future I believe is worth fighting for.

> Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points the unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.

Not sure where your experience is coming from, but when I asked an LLM (Claude, to be precise), it referred me to local options first, such as SQLite. It didn't consider cloud platforms at all until I asked, presumably because it can understand local code and data (it can query them directly and get results back) but cannot understand what's in the cloud unless you configure it properly and give it the env variables to query that data.
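The local-first setup described above, an app backed by a single SQLite file that an assistant can query directly, really is tiny. A minimal sketch using Python's built-in sqlite3 module (the filename and schema are illustrative, not from the thread):

```python
import sqlite3

# Open (or create) a local database file — no server, no cloud account.
conn = sqlite3.connect("photos.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS photos "
    "(id INTEGER PRIMARY KEY, path TEXT, caption TEXT)"
)
conn.execute(
    "INSERT INTO photos (path, caption) VALUES (?, ?)",
    ("dogs/rex.jpg", "Rex at the park"),
)
conn.commit()

# The whole "backend" is one file on disk that a tool can query directly.
rows = conn.execute("SELECT path, caption FROM photos").fetchall()
print(rows)
conn.close()
```

No install step beyond Python itself, and no credentials to configure, which is exactly why an assistant can introspect this kind of data where it can't see into a cloud database.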

  • What was your prompt?

    In my experience it’s great at utilizing local storage and SQLite, if you ask it to.

    I just asked the ChatGPT web client (4o, as that’s what most non-developers might default to):

    > Can you build me a website for my photos

    And it immediately started suggesting Wordpress, Wix, Squarespace, etc.

    Specifically, this was section 4 of the various questions it asked me:

    > 4. Tech Preference (optional)

    > - Do you want this as a static HTML site, WordPress, or built with something like React, Next.js, or Wix/Squarespace?

    > - Do you need help hosting it (e.g., using Netlify, Vercel, or shared hosting)?

    As a non-programmer, I likely wouldn’t understand half those words, and the section is marked optional.

    If I follow the “default path” I’m quickly forking over a credit card and uploading my pictures of dogs/family/rocks to the cloud.
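For contrast, the "static HTML site" option buried in that list can be entirely local: a gallery page generated from a folder of images, with nothing uploaded anywhere. A sketch, assuming Python and a local `photos/` directory (both names are illustrative):

```python
from pathlib import Path

# Build a single-file gallery from whatever images sit in ./photos —
# the result is an index.html you open directly in a browser.
photo_dir = Path("photos")
photo_dir.mkdir(exist_ok=True)  # illustrative; normally already full of images

images = sorted(
    p for p in photo_dir.iterdir()
    if p.suffix.lower() in {".jpg", ".jpeg", ".png", ".gif"}
)
tags = "\n".join(f'<img src="{p.as_posix()}" alt="{p.stem}">' for p in images)
Path("index.html").write_text(
    f"<!doctype html><title>My Photos</title>\n{tags}\n"
)
```

That's the whole "default path" without a credit card, which is roughly what the parent comment is arguing the models should reach for first.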

Local LLMs are even more amazing in concept: all of the world's knowledge, and someone to guide you through learning it, needing nothing but electricity (and a hilariously expensive inference rig) to run.

I would be surprised if in a decade we don't have local models that are an order of magnitude better than current cloud offerings while being smaller and faster, plus affordable ASICs to run them. That'll be the first real challenger to the internet's current position as "the" place for everything. The more the web gets enshittified and commercialized and ad-ridden, the more people will flock to this sort of option.