Comment by TekMol · 8 days ago

Are there still reasons to use PostgreSQL?

I like the simplicity of SQLite's "a file is all you need" approach so much, that I started to converge all my projects to SQLite. So far, I have not come across any roadblocks.
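The whole "setup" really is just opening a file. A minimal Python sketch (file and table names are made up):

    import sqlite3

    # The database is just a file; connect() creates app.db if it doesn't exist.
    conn = sqlite3.connect("app.db")
    conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
    conn.execute("INSERT INTO notes (body) VALUES (?)", ("hello",))
    conn.commit()
    print(conn.execute("SELECT id, body FROM notes").fetchall())
    conn.close()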

Can anyone think of a use case where PostgreSQL is better suited than SQLite?

The biggest one is redundancy. Architecting with read replicas is much easier with Postgres than SQLite because of its server model.
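At the application level, that usually shows up as a read/write split. A rough Python sketch with psycopg2 (hostnames and the events table are made up; the replica itself is assumed to already be configured):

    import psycopg2

    # Writes go to the primary; reads can fan out to any replica.
    primary = psycopg2.connect("host=db-primary.internal dbname=app user=app")
    replica = psycopg2.connect("host=db-replica-1.internal dbname=app user=app")

    # "with connection" runs the statements in a transaction and commits on success.
    with primary, primary.cursor() as cur:
        cur.execute("INSERT INTO events (kind) VALUES (%s)", ("signup",))

    # Note: streaming replication is asynchronous by default,
    # so this read may briefly lag the write above.
    with replica.cursor() as cur:
        cur.execute("SELECT count(*) FROM events")
        print(cur.fetchone()[0])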

SQLite on the server is a fantastic starter database: dead simple to set up, highly performant, and it scales way higher (vertically) than anyone gives it credit for.

But there certainly is a point where you'll have to scale out instead of up, and while there are some great solutions for that (rqlite, litefs, dqlite, marmot), it's not inherent to SQLite's design.

  • Should replication really be a concern of the DB layer?

    Replication means sending the queries that alter data to multiple machines, right?

    Shouldn't that be done by software one level up, which takes in queries via some network protocol and then sends them on to all the machines?

    That would sound more logical to me (see the sketch at the end of this thread).

    • Historically, yes. Databases were software concerned with both storage and networking.

      It's fine to want to separate those out, but it's not easy to do so and there are reasons they've been coupled for decades.

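    For illustration, here is roughly what that naive "one level up" fan-out could look like. A toy Python sketch (file names are made up; a real system would also need to handle ordering, failures, and non-deterministic SQL):

        import sqlite3

        class FanoutWriter:
            """Naive statement-based replication: apply each write to N SQLite files."""

            def __init__(self, paths):
                self.conns = [sqlite3.connect(p) for p in paths]

            def execute(self, sql, params=()):
                # Every replica must apply writes in the same order, and the SQL
                # must be deterministic (no random(), no timestamps) to stay in sync.
                for conn in self.conns:
                    conn.execute(sql, params)
                    conn.commit()

        writer = FanoutWriter(["replica_a.db", "replica_b.db"])
        writer.execute("CREATE TABLE IF NOT EXISTS kv (k TEXT PRIMARY KEY, v TEXT)")
        writer.execute("INSERT OR REPLACE INTO kv VALUES (?, ?)", ("greeting", "hello"))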

Sometimes you have applications that should not be able to access an entire database. There are various other scaling reasons, and PG extensions that can be helpful. But I agree that for small to medium-sized projects, SQLite is highly underrated.
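On the access-control point: Postgres can give each application its own role with narrow grants, something SQLite's single-file access model can't enforce. A rough sketch with psycopg2 (role, table, and password names are made up):

    import psycopg2

    # Assumes an administrative connection; all names below are hypothetical.
    conn = psycopg2.connect("host=db.internal dbname=app user=admin")
    conn.autocommit = True
    with conn.cursor() as cur:
        # The reporting service may read orders but never sees the users table.
        cur.execute("CREATE ROLE reporting LOGIN PASSWORD 'change-me'")
        cur.execute("GRANT SELECT ON orders TO reporting")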

When your application scales beyond one machine that needs access to the same database, PostgreSQL becomes an obviously better choice than SQLite. Until that point, SQLite is a fine, and honestly underrated, choice.

DuckDB is another option worth considering.

  • Should the concept of "machines" really be a concern of the DB layer?

    SQLite already allows multiple connections, so putting it on a server and adding a program that speaks a network protocol and proxies queries to the DB sounds more logical to me? (A sketch follows at the end of this thread.)

    • High performance software is written acknowledging the reality that it will run on hardware. Databases tend to be a class of software that is hyper-focused on performance.

      Writing a networked application that uses SQLite as a database is perfectly reasonable. You're just making the decision to lift the layer of abstraction that is concerned with machines from the DB to your application, which may or may not be a reasonable thing to do.
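      To make that concrete, here is a toy version of such a proxy. A Python sketch with no auth or parameter binding (file name and port are made up), just to show where the network boundary would sit:

          import json
          import sqlite3
          from http.server import BaseHTTPRequestHandler, HTTPServer

          class SQLiteProxy(BaseHTTPRequestHandler):
              # Single-threaded on purpose: one process serializes access to the file.
              conn = sqlite3.connect("app.db")

              def do_POST(self):
                  # The POST body is a raw SQL statement; rows come back as JSON.
                  length = int(self.headers["Content-Length"])
                  sql = self.rfile.read(length).decode()
                  try:
                      rows = self.conn.execute(sql).fetchall()
                      self.conn.commit()
                      body = json.dumps(rows).encode()
                      self.send_response(200)
                  except sqlite3.Error as exc:
                      body = str(exc).encode()
                      self.send_response(400)
                  self.send_header("Content-Length", str(len(body)))
                  self.end_headers()
                  self.wfile.write(body)

          HTTPServer(("127.0.0.1", 8080), SQLiteProxy).serve_forever()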

Certainly, if you need a network-attached database and aren't creating your own home-brew network-attached database (the so-called API server), Postgres is a pretty good choice.