Yes, a thousand percent! I'm working on this too. I'm sick of everyone trying to come up with a use case to get all my data into everyone's cloud so I have to pay a subscription fee just to make things work. I'm working on a fitness tracking app right now that will use the Sublime model: just buy it, get updates for X years, sync with all your devices, and use it forever. If you want updates after X years, buy the newest version again. If it's good enough as is - and that's the goal - just keep using it forever.
This is the model I want from 90% of the software out there: just give me a reasonable price to buy it, make the product good, and don't marry it to the cloud so much that it's unusable without it.
There are also a lot of added benefits to this model beyond data privacy (most are mentioned in the article), but not all the problems are solved here. This is a big space that still needs a lot of tooling to make things really smooth, but the tech to do it is there.
Finally, the best part (IMHO) about local-first software is that it brings back a much healthier incentive structure: you're not monetizing via ads, tracking users, or maxing "engagement" - you're just building a product and getting paid for how good it is. To me it feels like software that actually serves the user.
Obsidian, the note-taking app, is a great model to follow as well. The client is completely free and they sell an optional syncing service. The notes are all plain markdown files, so the client is completely optional.
This is the reason I have always refused to use the Bear note-taking app, irrespective of how good and snappy that app is. They keep their notes in a SQLite db now, and even though that file can be backed up and handled locally, my notes are not easily accessible to me. I can't easily edit my notes in other editors (which I often like to do on my Mac), and I can't back up and sync those files under version control the way I want outside of iCloud (which is what Bear uses).
What is sad is that they used to be a local-files-first note app, and then they moved to SQLite citing sync and performance issues.
The benefit of local-first is that you’re not incentivized to sell your cloud offering, so you can just give options: sync with iCloud, Google Drive, OneDrive, Dropbox, Mega, SMB, SFTP, FTP, whatever you feel like adding support for. And since local-first usually means having some kind of sane file format, you can let “advanced” users manage their own files and synchronization like people have been doing for the last 50 years.
There are a lot of valid answers to this! One is to use your platform's provided one, like OneDrive or iCloud. Another is to integrate with some other sync platform; Dropbox is a popular target for this. Peer-to-peer is another, although that obviously also comes with limitations. Finally, bring-your-own-sync is a popular choice amongst open-source apps, where you provide a self-hostable sync server.
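Whichever of these you pick, the app core doesn't have to care. A rough TypeScript sketch (all names hypothetical) of a pluggable backend interface that lets one app target iCloud, Dropbox, WebDAV, or a self-hosted server interchangeably:

```typescript
// Hypothetical pluggable sync backend: the app core only moves opaque
// files around, so any storage provider can sit behind this interface.
interface SyncBackend {
  list(): Promise<string[]>;                             // remote file paths
  read(path: string): Promise<Uint8Array>;               // download one file
  write(path: string, data: Uint8Array): Promise<void>;  // upload one file
}

// Sketch of a WebDAV-backed implementation (error handling omitted).
class WebDavBackend implements SyncBackend {
  constructor(private baseUrl: string, private auth: string) {}

  async list(): Promise<string[]> {
    const res = await fetch(this.baseUrl, {
      method: "PROPFIND",
      headers: { Authorization: this.auth, Depth: "1" },
    });
    // Naive parse of the multistatus XML; a real client would use a DAV library.
    return [...(await res.text()).matchAll(/<d:href>([^<]+)<\/d:href>/gi)]
      .map((m) => m[1]);
  }

  async read(path: string): Promise<Uint8Array> {
    const res = await fetch(this.baseUrl + path, {
      headers: { Authorization: this.auth },
    });
    return new Uint8Array(await res.arrayBuffer());
  }

  async write(path: string, data: Uint8Array): Promise<void> {
    await fetch(this.baseUrl + path, {
      method: "PUT",
      headers: { Authorization: this.auth },
      body: data,
    });
  }
}
```

The Dropbox, iCloud, and bring-your-own-server variants would just be other implementations of the same three methods.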
Check out Aardvark (renamed to Reflection), a collaborative note-taking app from the GNOME folks. I think the idea isn't to completely remove cloud infrastructure, but to at least make it optional and/or provide alternatives. For example, this note app works via P2P. blogs.gnome.org/tbernard/2025/06/30/aardvark-summer-2025-update/
For Joplin I use WebDAV with the 10 GB of free file storage that comes with Fastmail. So I have easy sync across multiple platforms and form factors, and even substantial notes make little dent in the allowance.
Ideally, you would use existing commodity infrastructure, but we have found that none of it is really fit for our purposes. Failing that, we have been developing an approach to low-maintenance, reusable infrastructure. For now, I would advise running your own, but positioning yourself to take advantage of commodity systems as they emerge.
> get updates for X years, sync with all your devices, and use it forever. If you want updates after X years, buy the newest version again. If it's good enough as is - and that's the goal - just keep using it forever.
While this sounds like a good deal, with this approach:
- You have to charge the total cost of the subscription up front (1 or 2 years' worth).
- You still have to keep servers running for syncing, and you have to think about cases where a user syncs a year of data in a single day.
- You have to keep people on the payroll for future development.
(This is looking at it only from the developer's perspective.)
Had similar thoughts a few years back (https://rodyne.com/?p=1439) when considering worst-case scenarios after a local factory lost two days of production due to a server failure at an IT supplier.
A backend can be part of the functionality though, such as for real-time collaboration and syncing. But you can have ownership and longevity guarantees for both the data and the service as long as you can eject [1] from the cloud and switch to self-hosting or back at any time, which is what we do for our notes/tasks IDE.
Totally agree. If you don't mind - what tech stack are you using for your fitness tracking app? I'm particularly curious about how you handle cross-device sync :)
Sure you could. I'm not, though - I don't think it's in the spirit of local-first, and I wouldn't pay money for that. But if you or someone else wants to build that kind of software, it's a free world :)
Fitness data tells a lot: your health status, your daily schedule, and with running/cycling/... your exact whereabouts. That's quite valuable information.
A notepad also isn't enough to correlate heart rate etc. with specific exercises and plot it over time.
> I'm sick of everyone trying to come up with a use case to get all my data in everyone's cloud so I have to pay a subscription fee to just make things work.
AI photo and video generation is impractical to run locally.
ComfyUI and Flux exist, but they serve a tiny sliver of the market with very expensive gamer GPUs. And if you wanted to cater to that market, you'd have to support dozens of different SKUs and deal with Python dependency hell. And even then, proficient ComfyUI users are spending hours experimenting and waiting for renders - it's really only a tool for niche artists with extreme patience, such as the ones who build shows for the Las Vegas Sphere. Not your average graphics designers and filmmakers.
I've been wanting local apps and local compute for a long time, but AI at the edge is just so immature and underpowered that we might see the next category of apps only being available via the cloud. And I suspect that these apps will start taking over and dominating much of software, especially if they save time.
Previously I'd only want to edit photos and videos locally, but the cloud offerings are just too powerful. Local cannot seriously compete.
But who said anything about AI? Lots of local-first apps neither have nor need any AI whatsoever. And by the way, Topaz Labs has good offerings for editing photos and videos with AI that run locally; they work great for many use cases (although they're not fully generative like Veo etc. - more like upscaling and denoising, which does use generative AI, but not like the former).
There is now a great annual local-first software conference in Berlin (https://www.localfirstconf.com/) organised by Ink & Switch, and it has spawned a spin-out, Sync Conf, this November in SF (https://syncconf.dev/)
There was a great panel discussion this year with a number of the co-authors of the paper linked, discussing what local-first software is in the context of dev tools and what they have learnt since the original paper. It's very much worth watching: https://youtu.be/86NmEerklTs?si=Kodd7kD39337CTbf
The community is very much settling on “sync” being a component of local-first, but one applicable much more widely - along with local-first being a characteristic of end-user software, with dev tools (such as sync engines) being enabling tools but not “local-first” in themselves.
It's an exciting time for the local-first / sync engine community. We've been working on tools that enable realtime and async collaborative experiences, and now with the onset of AI the market for this is exploding. Every AI app is inherently multi-user collaborative, with agents as actors within the system. This requires the tech that the sync engine community has been working on.
Anything with online dependencies will necessarily require ongoing upkeep and ongoing costs. If a system is not local-first (or ideally local-only), it’s not designed for long-term dependability.
Connected appliances and cars have got to be the stupidest bit of engineering from a practical standpoint.
The entire thing is because of subscription revenue.
It’s self reinforcing because those companies that get subscription revenue have both more revenue and higher valuations enabling more fund raising, causing them to beat out companies that do not follow this model. This is why local first software died.
I remember seeing somebody summarize this as "SaaS is a pricing model" or "SaaS is financialization" and it totally rings true. Compared to normal software pricing, a subscription gives you predictable recurring revenue and a natural sort of price discrimination (people who use your system more, pay more). It's also a psychological thing: folks got anchored on really low up-front prices for software, so paying $2000 for something up-front sounds crazy even if you use it daily for years, but paying $25/month feels reasonable. (See also how much people complain about paying $60 for video games which they play for thousands of hours!)
It's sad because the dynamics and incentives around clear, up-front prices seem generally better than SaaS (more user control, less lock-in), but almost all commercial software morphs into SaaS thanks to a mix of psychology, culture and market dynamics.
There are other advantages to having your software and data managed by somebody else, but they are far less determinative than structural and pricing factors. In a slightly different world, it's not hard to imagine relatively expensive software up-front that comes with a smaller, optional (perhaps even third-party!) subscription service for data storage and syncing. It's a shame that we do not live in that world.
Maybe the root cause of the problem is that it's easier to make personalized stuff with a server/backend (cloud?) than without?
Example: I made a Firefox extension that automatically fills forms using an LLM. It's fully offline except (optionally) the LLM part - optional because it also supports Ollama locally.
Now the issue is that it's way too hard for most people to use: find an LLM to run, acquire it somehow (pay to run it online, or download it to run in Ollama), configure your API URL, enter an API key, and save all of your form-filling details locally in text files, which you then have to back up and synchronize to other devices yourself.
The alternative would be: create an account, hand over money, enter your details, and everything is synced and backed up automatically across devices, with an online LLM pre-selected and configured. Ready to go. No messing around with Ollama or OpenRouter - just go.
I don't know how to solve it in a local way that would be as user-friendly as the subscription way would be.
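For what it's worth, the purely local call itself is tiny - a rough sketch of what talking to Ollama's local HTTP API looks like (the model name is just an example the user would have pulled beforehand):

```typescript
// Ask a locally running Ollama instance to suggest a value for one form field.
// Assumes Ollama's default endpoint; the model must already be pulled locally.
async function suggestFieldValue(
  fieldLabel: string,
  profile: Record<string, string>, // the user's locally stored details
): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1", // example; whatever the user configured
      prompt:
        `Given this profile: ${JSON.stringify(profile)}\n` +
        `Reply with only the value for the form field "${fieldLabel}".`,
      stream: false, // one JSON response instead of a token stream
    }),
  });
  const data = await res.json();
  return (data.response as string).trim();
}
```

The code was never the hard part - getting a non-technical user to the point where that endpoint exists and has a model behind it is.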
Now things like cars and washing machines are a different story :p
Yeah, Dropbox, Apple, etc. provide enough free or paid storage that it shows you the true cost: circa $10 for 2 TB. Cloudflare lets you host static files pretty much for free. The cost is a rounding error.
So you can run 1000 local-first apps that sync to Dropbox for that $10/month in storage. And that storage is full B2C-level, ready to go, not some low-level S3-like primitive: it has auth, it has support, it has programs to sync.
Anybody opposing the "Stop Killing Games" initiative should read this comment.
Nobody is forcing anybody to make their games rely solely on online services. It's not a legal requirement, regulatory requirement, or anything else. It is a choice, like most things in software. To make the choice to rely on online services and then say "we'll have to spend money later to unfuck this!" is honestly short-sighted and pathetic, and nobody should accept it.
This was refreshing to read! More apps should be local-first. If the user does not want to sync their data to cloud, they should have that option.
I’ve been building the offline-first (or local-first) app Brisqi[0] for a while now, it was designed from the ground up with the offline-first philosophy.
In my view, a local-first app is designed to function completely offline for an indefinite period. The local experience is the foundation, not a fallback, and cloud syncing should be a secondary enhancement, not a requirement.
I also don’t consider apps that rely on temporary cache to be offline-first. A true offline-first app should use a local database to persist data. Many apps labeled as “offline-first” are actually just offline-tolerant, they offer limited offline functionality but ultimately depend on reconnecting to the internet.
Building an offline-first app is certainly more challenging than creating an online-only web app. The syncing mechanism must be reliable enough to handle transitions between offline and online states, ensuring that data syncs to the cloud consistently and without loss. I’ve written more about how I approached this in my blog post[1].
How has it been going? I've been thinking of trying this model, but I'm a bit worried about how much harder it would be to make it sustainable as a business.
If I were to do it all over again, I’d simplify things even further. I’d start by testing the idea with just the offline version. For syncing, I’d focus on syncing the data itself - just the database records, not the user actions. Instead of replaying every action, I’d track which records were modified and sync only those. No need to capture what changed, just that the record changed. Hopefully that makes sense.
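A rough sketch of what I mean (TypeScript, names made up): every write stamps the row, and sync just pushes rows newer than the last successful sync.

```typescript
// Each record carries a modification timestamp instead of an action log.
interface Row {
  id: string;
  updatedAt: number; // ms since epoch, set on every local write
  data: unknown;
}

// Push only the rows touched since the last successful sync.
async function syncDirtyRows(
  allRows: Row[],
  lastSyncedAt: number,
  push: (rows: Row[]) => Promise<void>, // e.g. an HTTP POST; server upserts by id
): Promise<number> {
  const dirty = allRows.filter((r) => r.updatedAt > lastSyncedAt);
  if (dirty.length > 0) {
    await push(dirty);
  }
  return Date.now(); // becomes the new lastSyncedAt once the push succeeds
}
```

(In practice you'd want a server-assigned version counter rather than wall-clock time, so clock skew between devices can't make changes invisible.)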
The business is doing alright, but what really keeps me going is that the app is something I genuinely wanted for myself. I use it all the time, it's always open in the background, so that keeps me motivated.
Cool to see principles behind this, although I think it's definitely geared towards the consumer space. Shameless self-plug, but related: we're doing this for industrial assets/industrial data currently (www.sentineldevices.com), where the entire training, analysis and decision-making process happens on customer equipment. We don't even have any servers they can send data to; our model is explicitly geared toward everything happening on-device (so I found the network principle the article discussed really interesting). This is to support use cases in SCADA/industrial automation where you just can't bring data to the outside world.

There's IMO a huge customer base and set of use cases that are just casually ignored by data/AI companies, because actually providing a service where the customer/user is turns out to be too hard, and they'd prefer to have the data come to them while keeping vendor lock-in.

The funny part is, in discussions with customers we actually have to lean in and be very clear on the "no, this is local, there's no external connectivity" piece, because they really don't hear that anywhere, and sometimes we have to walk them through it step by step to help them understand that everything is happening locally. It also tends to break the brains of software vendors. I hope local-first software starts taking hold more in the consumer space so we can see people start getting used to it in the industrial space.
It doesn't help that all the SCADA vendors are jumping on the cloud wagon and trying to push us all in that direction. "Run your factory from your smartphone!" Great, now I'm one zero-day away from some script kiddie playing around with my pumps.
An exciting space and I'm glad you and your team are working in it.
I looked over your careers page and see all of your positions are non-remote.
Is this because limitations of working on local-first software require you to be in person? Or is this primarily a management issue?
Personally, I disagree with this approach. This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).
The problems with closed-source software (lack of control, lack of reliability) were solved with a new business model: open source development, which came with new licenses and new ways of getting revenue (maintenance contracts instead of license fees).
In the same way, we need a business model solution to cloud-vendor ills.
Imagine we create standard contracts/licenses that define rights so that users can be confident of their relationship with cloud-vendors. Over time, maybe users would only deal with vendors that had these licenses. The rights would be something like:
* End-of-life contracts: cloud-vendors should contractually spell out what happens if they can't afford to keep the servers running.
* Data portability guarantees: Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.
* Data privacy transparency: Vendors must track/audit all data access and report to the user who/what read their data and when.
I'm sure you can think of a dozen other clauses.
The tricky part is, of course, adoption. What's in it for the cloud-vendors? Why would they adopt this? The major fear of cloud-vendors is, I think, churn. If you're paying lots of money to get people to try your service, you have to make sure they don't churn out, or you'll lose money. Maybe these contracts come only with annual subscription terms. Or maybe the appeal of these contracts is enough for vendors to charge more.
> This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).
Whenever it's possible to solve a business problem or political problem with a technical solution, that's usually a strong approach, because those problems are caused by an adversarial entity and the technical solution is to eliminate the adversarial entity's ability to defect.
Encryption is a great example of this if you are going to use a cloud service. Trying to protect your data with privacy policies and bureaucratic rules is a fool's errand because there are too many perverse incentives. The data is valuable, neither the customer nor the government can easily tell if the company is selling it behind their backs, it's also hard to tell if the provider has cheaped out on security until it's too late, etc.
But if it's encrypted on the client device and you can prove with math that the server has no access to the plaintext, you don't have to worry about any of that.
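A minimal sketch of that client-side step using the standard WebCrypto API (not any particular vendor's scheme):

```typescript
// Generate a key once, on the device; it never leaves the client.
async function makeLocalKey(): Promise<CryptoKey> {
  return crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    true, // extractable, so the user can back the key up themselves
    ["encrypt", "decrypt"],
  );
}

// Encrypt before upload: the server stores only ciphertext plus the IV.
async function encryptForUpload(
  key: CryptoKey,
  plaintext: Uint8Array,
): Promise<{ iv: Uint8Array; ciphertext: ArrayBuffer }> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // unique per message
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    plaintext,
  );
  return { iv, ciphertext };
}
```

The server that stores `{ iv, ciphertext }` can lose it, leak it, or sell it, and the plaintext still stays private - that's the "prove it with math" part.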
The trouble is sometimes you want the server to process the data and not just store it, and then the technical solution becomes, use your own servers.
I 100% agree, actually. If there were a technical solution, then that's usually a better approach.
For something like data portability--being able to take my data to a different provider--that probably requires a technical solution.
But other problems, like enshittification, can't be solved technically. How do you technically prevent a cloud vendor from changing their pricing?
And you're right that the solution space is constrained by technical limits. If you want to share data with another user, you either need to trust a central authority or use a distributed protocol like blockchain. The former means you need to trust the central provider; the latter means you have to do your own key-management (how much money has been lost by people forgetting the keys to their wallet?)
There is no technical solution that gets you all the benefits of central plus all the benefits of local-first. There will always be trade-offs.
Does this really solve the problem? Let's say I'm using a cloud provider for some service I enjoy. They have documents that spell out that if they have to close their doors they will give X months of notice and allow for a data export. Ok, great. Now they decide to shut their doors and honor those agreements. What am I left with? A giant JSON file that is effectively useless unless I decide to write my own app, or some nice stranger does? The thought is there, it's better than nothing, but it's not as good as having a local app that will keep running, potentially for years or decades, after the company shuts their doors or drops support.
Data portability is, I think, useful even before the service shuts down. If I'm using some Google cloud-service and I can easily move all my data to a competing service, then there will be competition for my business.
What if cloud platforms were more like brokerage firms? I can move my stocks from UBS to Fidelity by filling out a few forms and everything moves (somewhat) seamlessly.
My data should be the same way. I should be able to move all my data out of Google and move it to Microsoft with a few clicks without losing any documents or even my folder hierarchy. [Disclaimer: Maybe this is possible already and I'm just out of the loop. If so, though, extend to all SaaS vendors and all data.]
This would make cloud vendors kind of like banks. The cloud vendor is holding a kind of property for the user in the user's account. The user would have clearly defined rights to that property, and the legal ability to call this property back to themselves from the account.
This calling back might amount to taking delivery. In a banking context, that is where the user takes delivery of whatever money and other property is in the account. In the cloud vendor case, this would be the user receiving a big Zip file with all the contents of the account.
Taking delivery is not always practical and is also not always desirable. Another option in a financial context is transferring accounts from one vendor to another: this can take the form of wiring money or sometimes involves a specialized transfer process. Transferring the account is probably way more useful for many cloud services.
This leads us to a hard thing about these services, though: portability. Say we delineate a clear property interest for users in their cloud accounts, and we delineate all of their rights. We have some good interests and some good rights; but what does it mean to take delivery of your Facebook friends? What does it mean to transfer your Facebook account from one place to another?
> This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).
I don't think that's quite correct. I think the authors fully acknowledge that the business case for local-first is not completely solved and is a closely related problem. These issues need both a business and a technical solution, and the paper proposes a set of characteristics of what a solution could look like.
It's also incorrect to suggest that local-first is an argument for decentralisation - Martin Kleppmann has explicitly stated that he doesn't think decentralised tech solves these issues in a way that could become mass market. He is a proponent of centralised standardised sync engines that enable the ideals of local-first. See his talk from Local-first conf last year: https://youtu.be/NMq0vncHJvU?si=ilsQqIAncq0sBW95
I'm sure I'm missing a lot, but the paper is proposing CRDTs (Conflict-free Replicated Data Types) as the way to get all seven checkmarks. That is fundamentally a distributed solution, not a centralized one (since you don't need CRDTs if you have a central server).
And while they spend a lot of time on CRDTs as a technical solution, I didn't see any suggestions for business model solutions.
In fact, if we had a business model solution--particularly one where your data is not tied to a specific cloud-vendor--then decentralization would not be needed.
I get that they are trying to solve multiple problems with CRDTs (such as latency and offline support), but in my experience (we did this with Groove in the early 2000s) the trade-offs are too big for average users.
Tech has improved since then, of course, so maybe it will work this time.
> * Data portability guarantees: Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.
This is not practical for data of any size. Prod migrations to a new database take months or even years if you want things to go smoothly. In a crisis you can do it in weeks, but it can be really ugly. That applies even when moving between the same version of an open source database, because there's a lot of variation between the cloud services themselves.
The best solution is to have the data in your own environment to begin with and just unplug. It's possible with bring-your-own-cloud management combined with open source.
My company operates a BYOC data product which means I have an economic interest in this approach. On the other hand I've seen it work, so I know it's possible.
I'd love to know more about BYOC. Does that apply to the raw data (e.g., the database lives inside the enterprise) or the entire application stack (e.g., the enterprise is effectively self-hosting the cloud).
It seems like you'd need the latter to truly be immune to cloud-vendor problems. [But I may not understand how it works.]
> Personally, I disagree with this approach. This is trying to solve a business problem (I can't trust cloud-providers)
It is not only a business problem. I stay away from cloud based services not only because of subscription model, but also because I want my data to be safe.
When you send data to a cloud service, and that data is not encrypted locally before being sent to the cloud (a rare feature), it is not a question of if but when that data will be pwned.
"Trust about whether or not another company will maintain confidentiality" still sounds like a business problem to me (or at least one valid way of perceiving the problem)
And the biggest advantage I see of this perspective over the "technical problem" perspective is that assigning responsibility completely covers the problem space, while "hope that some clever math formula can magic the problem away" does not.
Putting stuff in escrow is usually the way to go: escrow service is paid upfront (say, always for the next 3 months), and that's the time you've got to pull out your data.
My company does that with a few small vendors we've got for the source code we depend on.
But that's the point of contracts, right? When a company shuts down, the contracts become part of the liabilities. E.g., if the contract says "you must pay each customer $1000 if we shut down" then the customers become creditors in a bankruptcy proceeding. It doesn't guarantee that they get all (or any) money, but their interests are negotiated by the bankruptcy judge.
Similarly, I can imagine a contract that says, "if the company shuts down, all our software becomes open source." Again, this would be managed by a bankruptcy judge who would mandate a release instead of allowing the creditors to gain the IP.
Another possibility is for the company to create a legal trust that is funded to keep the servers running (at a minimal level) for some specified amount of time.
(cont. thinking...) One possibility: a 3rd party manages a continually updating data escrow. It'd add some expense and complexity to the going concern.
A good contract can help you to seek some restitution if wrongdoing is done and you become aware of it and you can prove it. It won't mechanically prevent the wrongdoing from happening.
It can also help to align the incentives of multiple parties to actually care about the same goals.
"Mechanically preventing wrongdoing from happening" can be a bit of a Shangri-La. What Tech can mechanically do is increase the cost of wrongdoing, or temporarily deflect attempts towards easier targets. But that by definition cannot "solve the problem for everyone" as there will always be a lowest hanging fruit remaining somewhere.
What contracts can do is help to reduce the demand for wrongdoing.
Currently there are laws, but not for hosting. Look at the contract for Steam, for example, or Ubisoft, or anything else. Q: What happens to your game collection if we shut down our servers? A: You own nothing and lose everything, GG!
It's like how we decided we must protect users' privacy from greedy websites by making the bad ones spell out that they use cookies to spy on users - and the result is what we have now with the banners.
I agree with you! And your point about cookie banners underlines that we can't just rely on regulation (because companies are so good at subverting or outright lobbying their way out of it).
Just as with the open source movement, there needs to be a business model (and don't forget that OSS is a business model, not a technology) that competes with the old way of doing things.
Getting that new business model to work is the hard part, but we did it once with open source and I think we can do it again with cloud infrastructure. But I don't think local-first is the answer--that's just a dead end because normal users will never go with it.
It’s based on NixOS to provide as much as possible out of the box and declaratively: https, SSO, LDAP, backups, ZFS w/ snapshots, etc.
It’s a competitor to cloud hosting because it packages Vaultwarden and Nextcloud to store most of your data. It does provide more services than that though, home assistant for example.
It’s a competitor to YUNoHost but IMO better (or aims to be) because you can use the building blocks provided by SelfHostBlocks to self-host any packages you want. It’s more of a library than a framework.
It’s a competitor to NAS but better because everything is open source.
It still requires the user to be technical, but I'm working on removing that caveat. One of my goals is to let people install it on their hardware without needing Nix or touching the command line.
Love it! I've been thinking about this a lot lately. It's crazy how many great FOSS alternatives are out there to everything – and while they might be relatively easy to install for tech-people ("docker compose up"), they are still out of reach for non-tech people.
Also, so many of these selfhostable apps are web applications with a db, server and frontend, but for a lot of use cases (at least for me personally) you just use it on one machine and don't even need a "hosted" version or any kind of sync to another device. A completely local desktop program would suffice. For example I do personal accounting once a month on my computer – no need to have a web app running 24/7 somewhere else. I want to turn on the program, do my work, and then turn it off. While I can achieve that easily as a developer, most of the people can't.
There seems to be a huge misalignment (for lack of a better word) between the amount of high-quality selfhostable FOSS alternatives and the amount of people that can actually use them. I think we need more projects like yours, where the goal is to close that gap.
I will definitely try to use selfhostblocks for a few things and try to contribute, keep it up!
My guess as to why most apps are now a web UI on top of a DB is that it's easy to “install”. SelfHostBlocks is admittedly geared towards a central server serving web apps - or at least apps with a desktop or mobile component, but geared towards syncing to a central server.
Feel free to give it a try though, I'd love that! Also feel free to join the Matrix channel if you have any questions or just to get some updates.
I love that application. I plan to make some improvements to the web UI. I’d love to have multiple tabs with saved reports. That would allow my spouse to use it quite easily. I’ll be adding that at some point.
There is no reason for every application to have its own sync platform. I suspect this framing came out of mobile apps where there is no composability or modularity between programs.
If you really embrace "local first" just use the file system, and the user can choose from many solutions like git, box, etc.
I hate signing up for your sync just as much as for any other SaaS, but it's even more opaque and likely to break.
I agree that not every app needs its own sync engine, but I disagree with your framing that the file system is the universal way to embrace local-first. I have two reasons.
First is that, yeah, local-first - but I also want concurrency. If it's just local-first, you're right, any old sync will do. But I want more than that. I want to not have to think (à la Dropbox, being slick). I want my wife and me to be able to make separate edits on our phones when we're in a dead zone.
Second is that sync works a lot better when it has deep knowledge of the data structure and semantics. Git and box both have significant shortcomings, but both exacerbated by the concurrency desire.
But this problem isn't going to be solved by every app making its own sync system. Even if there is a magic library you can adopt that does a pretty good job, you still end up with everyone having their own completely independent hosting solution and sync schedule.
If files are insufficient, what data-structure would make modular sync possible for multiple applications in an OS?
And I’m not suggesting one doesn’t exist; I’m challenging you to present a comprehensive solution, one that probably involves operating systems.
> I want my wife and I to be able to make separate edits on our phones when we're in a dead zone.
If the app is designed for it you can use a hybrid approach, where a given "document" is stored in 1 file for each client, and the client merges the changes across all files. That way there's never a change conflict that something like Dropbox needs to handle and it can all be offloaded to the app.
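A sketch of that layout (hypothetical shapes): each client appends changes only to its own file, and every client rebuilds the merged document by folding all the files together.

```typescript
// One change-log file per client. Dropbox never sees a write conflict,
// because no two clients ever write to the same file.
interface Change {
  clientId: string;
  timestamp: number; // or a logical clock, to survive skewed wall clocks
  key: string;
  value: string;
}

// Merge every per-client log into one document: last write wins per key,
// with clientId as a tie-breaker so all clients converge to the same result.
function mergeDocument(perClientLogs: Change[][]): Map<string, string> {
  const ordered = perClientLogs
    .flat()
    .sort(
      (a, b) =>
        a.timestamp - b.timestamp || a.clientId.localeCompare(b.clientId),
    );
  const doc = new Map<string, string>();
  for (const change of ordered) {
    doc.set(change.key, change.value); // later changes overwrite earlier ones
  }
  return doc;
}
```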
I mostly agree with this, but sometimes it's not that simple in practice. I created an app that did exactly this and it resulted in inevitable file conflicts because I couldn't negotiate between the clients when a file should be allowed for editing.
In theory, I love the local-first mode of building. It aligns well with “small tech” philosophy where privacy and data ownership are fundamental.
In practice, it’s hard! You’re effectively responsible for building a sync engine, handling conflict resolution, managing schema migration, etc.
This said, tools for local-first software development seem to have improved in the past couple years. I keep my eye on jazz.tools, electric-sql, and Rocicorp’s Zero. Are there others?
Using Couch/Pouch on our current app for this reason. Great to work with. Though we’re not supporting offline-first right away (depends on external services), it’s going to help with resilience and a future escape valve.
I've been using instantdb in anger for the past month or so for a side project of mine. I'm building a personal budget app.
I should probably write a blog post, but I will say that I investigated PowerSync, ElectricSQL, and LiveStore before. I briefly looked at jazz.tools but wanted something a bit more structured.
I'm pretty impressed thus far. I've actually been writing it with Vue and a community library. Permissions were a bit tricky, but once I figured them out it was simple. And I like their magic email login. And I like their dashboard/REPL, but there are a few big changes I would make there to make it less fiddly.
I love that it's open source, and that if I want to, I could self host it.
As for the other options:
- jazz wasn't structured enough
- livestore came off as too fiddly with the event store, but it was appealing. That the dev tools are paywalled was disappointing, but understandable
- electricSQL really only provided half a solution (reads, but not the write model)
- couchDB / pouchDB weren't structured enough for me, and I wanted better cross-document support than was obvious / baked in.
+1 for Instant! Been using it and I find it a breeze to work with - definitely filling the exact niche this article was discussing. Sync engines are the future!
Rust and JavaScript implementations, a handful of network strategies. It doesn't come with the free or paid offering that jazz.tools does, but it's pretty nice.
It’s a local-first platform that supports real-time sync with CRDTs at its core, making conflict resolution much easier to manage. Ditto is designed to handle offline-first use cases and peer-to-peer sync out of the box, so you don’t have to build a custom sync engine from scratch.
It supports a wide range of platforms including Swift, Kotlin (Android), Flutter/Dart, React Native, JavaScript (Web/Node), .NET (C#), C++, Java, and Rust. You can dive deeper into what it offers from the docs site:
https://docs.ditto.live/home/about-ditto
Along with the others mentioned, it's worth highlighting Yjs. It's an incredible CRDT toolkit that enables many of the realtime and async collaborative editing experiences you want from local-first software.
I’ve built several apps on yjs and highly recommend it. My only complaint is that storing user data as a CRDT isn’t great for being able to inspect or query the user data server-side (or outside the application). You have to load all the user’s data into memory via the yjs library before you can work with any part of it. There are major benefits to CRDTs but I don’t think this trade-off is worth it for all projects.
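To make that concrete, here's roughly the shape of the workaround (using Yjs's public API, assuming the app keeps its state in a top-level map named "data"):

```typescript
import * as Y from "yjs";

// `storedUpdate` is the binary CRDT blob you persisted server-side.
// There's no way to query one field out of it: you must hydrate the
// whole document in memory first.
function inspectUserData(storedUpdate: Uint8Array): unknown {
  const doc = new Y.Doc();
  Y.applyUpdate(doc, storedUpdate); // deserializes the entire CRDT state
  return doc.getMap("data").toJSON(); // only now is the data readable
}
```

Fine for one document, painful when you want to run a query across thousands of users' docs.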
I use local software and sync files using git or sometimes fossil (both work fine on Android with Termux, for instance, for stuff I want to access on my phone). I don't host servers or use any special software that requires syncing data in special ways.
I've always thought that this article overstates the promise of CRDTs with regard to conflict resolution. For toy cases like a TODO list, yes, you can define your operations such that a computer can automatically reconcile conflicts - e.g. you only support "add" and "mark as complete", and if something gets marked as complete twice, that's fine.
But once you get past toy examples, you start wanting to support operations like "edit", and there generally isn't a way to infer the user's intent there. Like, if my cookie recipe starts with 100g of sugar, and I modify it on my phone to use 200g of sugar, and I modify it on my desktop to use 150g of honey instead of 100g of sugar, there are a bunch of ways to reconcile that:
1. Stick with 200g of sugar, drop the 1.5x honey substitution.
2. Stick with 150g of honey, drop the 2x.
3. Merge them - 300g of honey.
4. Merge them - 150g of honey and 50g of sugar.
There's no way for any automated system to infer my intent there. So you've got to either:
1. Ask the user to resolve the conflict. This means you have to build out the whole "resolve this merge conflict for me" UI and the promise of "conflict-free" has not been fulfilled.
2. Arbitrarily choose an option and silently merge. This risks badly surprising the user and losing changes.
3. Arbitrarily choose an option, but expose the fact that you've auto-resolved a conflict and allow the user to manually re-resolve. This requires even more UI work than option 1.
4. Constrain your data model to only allow representing intents that can be deterministically resolved. In practice I think this is too severe of a constraint to allow building anything other than toy apps.
IMO #1 and #3 are the least-bad options, but I don't think they're consistent with the expectations you'd have for CRDTs after reading this article.
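To see how easily intent gets dropped, here's the simplest CRDT, a last-writer-wins register, applied to the recipe example above (a sketch; timestamps invented):

```typescript
// A last-writer-wins register: the simplest CRDT.
interface LwwRegister<T> {
  value: T;
  timestamp: number; // wall clock or logical clock
  nodeId: string;    // tie-breaker, so merges are deterministic
}

function merge<T>(a: LwwRegister<T>, b: LwwRegister<T>): LwwRegister<T> {
  if (a.timestamp !== b.timestamp) {
    return a.timestamp > b.timestamp ? a : b; // the later edit "wins"
  }
  return a.nodeId > b.nodeId ? a : b; // arbitrary but deterministic tie-break
}

// Phone and desktop both edited the same recipe line while offline:
const phone   = { value: "200g sugar", timestamp: 1700000500, nodeId: "phone" };
const desktop = { value: "150g honey", timestamp: 1700000400, nodeId: "desktop" };
merge(phone, desktop).value; // => "200g sugar" - the honey substitution silently vanishes
```

The merge is "conflict-free" only in the sense that it never asks; it's option 2 from the list above, baked into the data type.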
> Arbitrarily choose an option, but expose the fact that you've auto-resolved a conflict and allow the user to manually re-resolve. This requires even more UI work than option 1.
This is what every "cloud file sharing" provider like Dropbox is doing. If there is a conflict, the version on the server is "the right one", and your locally conflicted file is copied on the side with some annotation in the file name.
Yeah, but Dropbox is sort of playing on easy mode, because the data is "just files" and you can manually resolve the conflict with regular old text editors, etc. If you don't expose your app's data model in the file system (and on a phone you generally wouldn't), that means you need to write something custom to resolve the conflicts.
"it is desirable for the software to run as a locally installed executable on your device, rather than a tab in a web browser."
OS-agnostic apps, meaning web apps, are such a killer feature. You can use the same app on Linux, macOS, Windows, Android, and iOS.
Even more, developing for the web is typically faster: you make a change in the code => you see the result on the screen. For example, phone apps written in Swift could be faster than ones written in React Native, but it is so annoying waiting for the compilation to finish after every small change.
----
When I worked on imagery data at an autonomous vehicles company, product managers pushed us to explore the data in the cloud, and it was soooo inconvenient.
As a result, the PMs were ignored and everyone had a personal desktop with GPUs and fast SSDs holding a local copy of the data, so that debugging and prototyping would be fast.
The lag you get working with heavy data remotely reminded me of moving back from an SSD to a slow HDD, where you needed to wait some time to see the result on the screen. It was only half a second every time, but it felt ultra annoying.
Self hosting (which is often adjacent to local-first software) is fine. I've done it for years.
But it is a nightmare when it goes wrong: the conclusion I've reached is that it is out of reach for regular people who don't want the Byzantine support load that can accompany something going wrong. They want turnkey. They want simple. They aren't interested in operating services; they're interested in using them.
The FLOSS model of self hosting doesn't really offer a reliable way of getting this: most businesses operating this way are undercapitalised and have little hope of ever being any other way. Many are just hobbies. There are a few exceptions, but they're rare and fundamentally the possibility of needing support still exists.
What is needed, imo, is to leverage the power of centralised, professional operations and development, but to govern it democratically. This means cooperatives where users are active participants in governance alongside employees.
I've done a little work towards this myself, in the form of a not-yet-seen-the-light-of-day project.
What I'd love to see is a set of developers and operators actually getting paid for their work and users getting a better deal in terms of cost, service, and privacy, on their own (aggregate) terms. Honestly, I'd love to be one of them.
Does anyone think this has legs to the same extent as local-first or self hosting? Curious to know people's responses.
This is the business model I want to have: I work on a stack of fully open source software and package them in a turn-key server that you own. You can use it on your own for free if you’re knowledgeable and I offer a subscription where I’m the sysadmin of the box you own and that I built for you. I do the maintenance, the updates, etc. There’s no lock-in because you can stop the subscription anytime or even just pick another sysadmin that would know the stack. The only reason you’d keep me around would be that the service I offer is pretty damn good. Would something like that appeal to you?
I was about to suggest that a better, more open, and fair form of capitalism would need to be used as a tool... but then, re-reading your comment - "...leverage the power of centralised, professional operations and development, but to govern it democratically..." - I think you better encapsulate what I meant to convey. :-)
That being said, yes, I do believe *in the near/upcoming future* local-first, self-hosting, and (I will add) more fair open source vendors will work! Well, at least, I hope so! I say that because Europe's recent desire to pivot away from the big U.S. tech companies and towards more digital sovereignty - in my opinion - lays the foundation for an ecosystem that could sustain self-hosting, etc. The more Europe is able to pivot away from big tech, the more possibility exists for more and varied non-big-tech vendors to manifest... and the more Europe adopts open source, the more the possibility that usage and expertise of self-hosting grows... plus, for those who do not know how to, or simply do not wish to, manage services themselves - well, in time I think Europe will have fostered a vast array of vendors who can provide such open source digital services and get paid a fair cost for providing fair value/services, etc. ...and, by the way, I say this all as a person biased in favor of open source AS WELL AS being an American. :-)
> What is needed, imo, is to leverage the power of centralised, professional operations and development, but to govern it democratically. This means cooperatives where users are active participants in governance alongside employees.
Utopia. Unattainable. Self-determination of the individual has been consistently persecuted under all societal arrangements; communism and capitalism equally hate a citizen that wants to remain independent and self-sufficient.
We need a term for a viable business model to pair with local-first tech.
I've been working on Relay [0] (realtime multiplayer for Obsidian) and we're trying to follow tailscale's approach by separating out the compute/document sync from our auth control plane.
This means that users still subscribe to our service (and help fund development) and do authn/authz through our service, but we can keep their data entirely private (we can't access it).
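In rough terms (a hypothetical sketch of the shape, not Relay's actual API): the control plane only ever issues credentials, and document bytes flow somewhere it can't read.

```typescript
// Hypothetical sketch of a control-plane / data-plane split.
// 1. authn/authz happens against the vendor's control plane;
// 2. documents sync with a node the vendor can't read (possibly self-hosted).
async function connectToSync(syncHost: string): Promise<WebSocket> {
  // Control plane issues a short-lived token after login.
  const res = await fetch("https://control.example.com/token", {
    credentials: "include",
  });
  const { token } = await res.json();

  // Document traffic goes straight to the data plane; the control plane
  // never sees document content, only who is allowed to connect.
  return new WebSocket(`wss://${syncHost}/sync?token=${token}`);
}
```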
Are you requiring a Google account for file/folder-based auth on a per-user basis for a vault? Not too keen on using a 3rd party for this kind of thing.
For our free/individual plan we do use OAuth2 providers (currently only Google is enabled, but considering others), and can support other methods for larger teams (like oidc).
Originally the idea was to keep everything within the Obsidian UI so things like username/password didn't make sense (no password managers in Obsidian).
We initiate the OAuth2 login flow from within Obsidian. I guess we could add an extra click that takes you to our website first and then add support for more auth methods from there. I don't really want it to feel like a web app though.
I'd love to hear your take. Which login method do you think is both simple and could be coherently used within Obsidian on all platforms?
I've been building exactly this with SoundLeaf [0] - an iOS client for the excellent open-source Audiobookshelf server. No data collection, no third-party servers, just your audiobooks syncing directly with your own instance.
The user-friendliness challenge is real though. Setting up Audiobookshelf [1] is more work than "just sign up," but once you have it running, the local-first client becomes much cleaner to build. No user accounts, no subscription billing, no scaling concerns.
Simple pricing too: buy once, own forever. No monthly fees to access your own audiobooks.
The primary challenge with building local first software is the sync layer. The current 3rd party offerings are not mature. And people have been working on these for a few years. Electric SQL comes to mind.
I'm curious on what you experienced that caused you to come to the conclusion that 3rd party sync solutions are not mature. There are several 3rd party vendors like Ditto that have been building local first sync solutions that are used by very large companies with success.
I've been wanting a computing model I call PAO [1] for a long time. PAO would run personal application "servers" and connect dynamic clients across all devices. PAO is centralized, but centralized per user, and operating at their discretion. It avoids synchronization, complex concurrent data structures, and many other problems associated with alternatives. Its weakness is a need for always-on networks, but that complication seems ever easier to accept as omnipresent networks become realistic.
It's a very exciting moment for this movement. A lot of the research and tech for local-first is nearing the point that it's mature, efficient, and packaged into well designed APIs.
Moreover, local-first —at least in theory— enables less infrastructure, which could reignite new indie open source software with less vendor lock-in.
However, despite all my excitement about embracing these ideas in the pursuit of better software, there's one hurdle preventing more widespread adoption amongst developers, and that is the Web platform.
The Web platform lacks building blocks for distributing hashed and/or signed software that isn't tied to origins. In other words, it's hard to decouple web-apps from the same-origin model which requires you set up a domain and serve requests dynamically.
Service Workers and PWAs do help a bit in terms of building offline experiences, but if you want users to download once, and upgrade when they want (and internet is available), you can't use the Web. So you end up breaking out of the browser, and start using Web technologies outside of the browser with better OS functionality, like Electron, React Native, Tauri et al (the https://userandagents.com/ community is doing some cool experiments in this space).
We need to get back to apps rather than webapps. The hardware compatibility issues of the past are basically no longer here, and there are three major OS types two of which can use each other's apps.
I recently started using Typst instead of Pandoc->LaTeX.
I held off on playing with Typst for years because I was under the (incorrect) impression that the only way to use it was with their web editor. I'm sure that their editor is completely fine, but I am pretty entrenched in Neovim and Pandoc had been serving me well.
Once I found out that Typst has a command line version that I can use directly, it became more appealing, because I'm pretty sick of cloud shit.
The data part aside, and specifically on the platform/functionality side: these cloud/large products unfortunately do offer more powerful/advanced features, or convenience. Be it cloud multi-device functionality that makes moving around and collaborating seamless, or enterprise products like Snowflake and Fabric that offer all sorts of things over a standard MSSQL db.
I'm personally very against vendor lock in, but there is some value to them.
> "we have gone further than other projects down the path towards production-ready local-first applications based on CRDTs"
This seems like a bold claim, but IMHO Ink & Switch have earned their solid reputation and it wouldn't surprise me if it's true. I agree w/ their analysis and am philosophically aligned w/ their user-centric worldview. So who's going to build "Firebase for CRDTs"?
I was just referring to the posted article's assertion that "Firebase for CRDTs" is a huge opportunity. I think I agree w the authors that a well-architected CRDT solution for local-first apps requires capabilities not currently provided by Firebase or any other vendor. But I'm no expert.
Complete agreement. Here's a brief, practical action plan for Windows users:
* Download all your data from Microsoft's "OneDrive" cloud storage, which if not disabled, is the default storage method in a new Windows install.
* Verify that all your files are now stored locally.
* Click the gear icon, go to "Settings" -> "Account" -> "Unlink this PC", then confirm with "Unlink account."
* Remove Microsoft's OneDrive app from your system -- full removal is the only way to prevent perpetual harassment and reactivation. Go to "Apps" -> "Apps & features" (or "Installed apps" on Windows 11) -> "Microsoft OneDrive", right-click, "Uninstall."
* Optional extra step: cancel your Microsoft 365 subscription and install LibreOffice (free, open-source).
Remember this -- cloud storage only has advantages for Microsoft and law enforcement (which have a number of easy ways to gain access to your documents compared to local storage). For a Windows user, cloud storage is the ultimate Dark Pattern.
This reminds me of my own painful story: I once made a local photo search app called Queryable that ported OpenAI's CLIP model to iPhone, letting you search your photos with queries like "a black cat sitting on a sofa."
Since it needed to access users' local photo libraries, I didn't want the app to connect to the internet under any circumstances. So I made it a paid app instead of the usual free+in-app purchases model, since the latter requires calling StoreKit which goes online. But because the app had to run the CLIP model, it would crash on lower-performance phones like the iPhone X. Users who paid for it couldn't use it and felt scammed, leading to tons of one-star reviews and angry complaints about their photos being stolen. Eventually I decided to open-source the app, though it never brought me much revenue anyway.
Two years later, Apple started announcing they'd be integrating this exact feature into Apple Intelligence : )
Apple doesn't allow developers to target specific device models, presumably to prevent discrimination. However, you have two options: 1. Set a minimum iOS version requirement, or 2. Restrict to devices with A12 chips or later. But neither approach can exclude certain problematic device models.
What are the top web local first frameworks worth checking out these days? i’ve heard of livestore, tanstack DB with electric, zero. any others that are easy to use and flexible? use case is multiplayer apps and maybe games. thanks!
The old model—a one-time purchase, local install, full user control—worked because devs could sell boxed software at scale. Now, that model collapses unless someone’s willing to either undervalue their own labour or treat the software like a public good, absorbing the long tail of maintenance with no recurring income.
The article posits it as though subscription software is something which has been sneaked in on us. But users today expect things like instant updates, sync across devices, collaboration, and constant bug fixes and patches - none of which come easily if you're only willing to pay for the system once.
> as though subscription software is something which has been sneaked in on us
Oh but it has (IMO).
> users today expect things like instant updates [...] constant bug fixes and patches
Nah, this is in reverse. With boxed software, the developer had to deliver an essentially bug-free product. Now, with easy updates technically possible, the developers have gone complacent, and deliver shit. That is why users expect bugfixes instantly. (And any enlightened user abhors unrequested features, as there are no features without regressions, and who wants regressions in any serious application?) The only tolerable online updates are security fixes.
> sync across devices, collaboration
This is a valid expectation, but its execution has been a train-wreck. Research, design and implementation should start with end-to-end encryption; the network architecture should be peer-to-peer (mesh, not centralized). What do we get instead? More centralization of control than ever, and less privacy and ownership than ever.
Generally that's not how I remember it - third party software on the Mac at least got some kind of a beach-head because Windows software was full of bugs, crashes, corrupted files, drivers that never worked, and patch CDs mailed to enterprise customers like they were firmware apologies. Own your own software, taken to its logical endpoint, was a shareware nightmare.
The old model of boxed updates is still in use by some companies today, JetBrains comes to mind. In either case you tuck major new features in a new major version or rolling yearly releases and sell the customer a license to the software that gets a year of updates. In a similar vein many apps I use on my Mac have monthly subscriptions but cancelling them limits their use to essentially one device, but doesn't remove the application or my access to the data.
We have been building a local-first browser app (PWA) for personal finance, based on double-entry accounting. https://finbodhi.com/
We do use online services like Firebase for auth, and some services to fetch commodity prices etc., but the rest of the data is stored in browser storage (SQLite) and backed up to local disk (and soon Dropbox). We also sync data across devices, always encrypting data in transit.
I think it's the way to go, for most personal data applications.
The demo doesn't work on my iPhone; it keeps spinning forever. Also, please consider removing the signup friction - if it's really local-first, you don't need accounts in the cloud.
It might be that you’re trying it in Safari’s private mode. It works in regular mode, but private mode blocks storage, so the app can’t function there. We should improve the error message to make that clearer.
That said, the app is currently designed for desktop use—mobile UX is still on our roadmap.
As for signup: it helps us track and bill users. Building our own auth system for local-first would’ve been a full project on its own. Until better options exist for authorization and billing in local-first apps, we’ll stick with a cloud signup flow.
Remember when the justification for cloud was "Your work is not trapped on one device". Well, turns out your cloud data is trapped on one device, AND it's not under your control.
I've made a local first, end-to-end encrypted, auto sync bookmark extension that doesn't milk your data in any way. It's 100% private, I even don't use Google analytics on my website. Some of the reasons why I've put some work into this is:
- because I could not find something similar that doesn't milk and own my data
- to never lose a bookmark again
- to have my bookmark data encrypted in the cloud
- to have private history
- to have some extra time saving features in the extension that are for unknown reason rare to find
- more learning and experience (it's actually quite complex to build this)
After about 4 years of using it daily on every PC I own, I found it's a pain for me and my family when it is not installed on a browser. I thought: if it's useful for us, it might be useful for others too! So I decided to make it available by subscription for a small fee to cover the server and other costs. I'm not really into marketing, so almost no one knows it exists. You can find it on markbook.io.
I feel like local-first or offline-first can be seen as something catered to niche users when it's brought up in front of strategy or planning folks — they imagine most people having good, reliable internet all the time. The truth is always more frustrating to account for. It is extremely frustrating to see the falloff in functionality in apps when internet is spotty or slow, which happens a lot. Try doing anything in most apps on the subway in New York (where there isn't service in most tunnels) and you'll feel the pain. Or, try doing anything in a crowd and the cell towers are saturated. Fastmail's client won't show you emails without internet, Balatro hangs while it looks for a cloud save, the list goes on and on.
Regarding the no-spinners: I think it is the wrong approach to argue that just because you have data locally, you don't need any spinners.
Whether you need a spinner or not should be decided by the User Experience (e.g., when the user has to wait for more than 100ms, show a spinner), and not by the location of the data. I am a big fan of local-first apps and enjoy building them myself. However, sometimes your app takes a moment to load. With local-first, you eliminate the network as a source of delays, but there are other factors as well, such as large data sets or complex algorithms.
For example, when you have a project planning software and want to plan 100 work packages with multiple resource combinations in an optimal way, depending on the algorithm, this can take some time. In that case, a spinner or a progress bar is a good thing.
I didn't get the impression that the author is advocating for removing spinners as a UI concept; rather, it's just being used as shorthand for "you should not need to send and load the data to and from elsewhere while you are working."
A properly designed app would leverage multithreading to place any long-running jobs in the background, allowing the user to carry on with other tasks.
You are aware that 'local-first' does not mean 'no-network'. Having a sync mechanism that runs in the background without user notification can be quite disconcerting.
I mean, I did it, I built an app with a transparent background sync. Then I added a special page, 'sync center'.
In reality, mobile devices don't always have perfect network connections. Therefore, when the user is unsure whether the device is in sync or if the sync is in progress but encounters an obstacle, they might perceive the app as unreliable.
Skimming the article, it seems to touch on a lot of the right points, but the motivating first paragraph seems weak:
> Cloud apps like Google Docs and Trello are popular because they enable real-time collaboration with colleagues, and they make it easy for us to access our work from all of our devices. However, by centralizing data storage on servers, cloud apps also take away ownership and agency from users. If a service shuts down, the software stops functioning, and data created with that software is lost.
"Apple pie might be tasty and nutritious and exactly what you want, but, theoretically, apple pie could burst into flames someday, and take your favorite pie-eating bib with it.
> Local-first apps, on the other hand, have better privacy and security built in at the core.
I love this article, but the section on security raised a lot of questions. What's the model for authorizing access to documents for collaboration? How do you manage keys safely for encrypted data? How do users recover "lost" keys?
Cloud computing models have a lot of security mechanisms built-in. You might not like the model (AWS IAM for example) but at least there's a foundation already in place.
Shamir's Secret Sharing lets you split a secret key among n members of your peer group such that any m of them (where m is less than n) can reconstruct it: hand out shares to 5 of your friends, and reproduce the secret later by getting any 3 of them to respond with their share. No peer can reproduce the secret by themselves.
There are other options for key storage, revoking group privileges, etc. It's an extensive topic, but the foundation is there, it just depends on your network and use cases.
Goal #2 (your data is not trapped on a single device) is the hard bit, especially alongside goal #3 (the network is optional). For #2 to be true, the network is *not* optional for the developer; it is required. Thus the entire complexity of building a distributed app, especially one without a centralized server (which is particularly difficult even with modern local-first database tools), greatly increases the cost of writing this type of software compared to either traditional desktop apps or cloud apps.
We tried to adopt this last month at work; it failed.
E.g. the aforementioned Automerge has poor docs (https://automerge.org/docs/reference/library_initialization/...) that left a lot of questions open. It seems backend-agnostic, but we had to figure out for ourselves how to store changes and how to broadcast them.
100% agree! I built Paisley (because it is the opposite of Plaid) to host your personal finances locally; it is 100% open source. Paisley pulls data from your financial institutions by scraping balances and importing CSV exports, storing everything locally in a simple SQLite database.
Awesome to see this getting more coverage. I am very interested in local-first and I am working on several progressive web apps based around this. One app depends on file sync, not database sync, and the best I have found is remoteStorage.js. It's not perfect, but it's very much the missing piece I was often looking for.
I love this idea of local-first software, but from a business point of view there's unfortunately no current incentive to adopt it, since it's nowhere near as profitable as SaaS. That, in my opinion, is the biggest bottleneck to worldwide adoption right now.
Nextcloud with a few addons - all open source - gets you feature parity with all of that lot.
NC itself gets you file sync, WebDAV, etc. An add-on gets you the web version of LibreOffice. You can bolt on AI add-ons to classify and tag your images/photos and, with a bit more effort, your docs too.
In a world of owning nothing and paying subscriptions for everything, owning your data and using software that is either yours or libre is 'rebellion' to many a service provider.
It's not local-first as some sort of cloud diet trend; it should be the norm.
Right. I don't even understand why this article had to be this verbose. It's not like we need to be "convinced" that local is better. Everybody who values privacy and independence knows already. But this stuff is unimplementable -- we suffer from the cloud disease because it's immensely profitable for the cloud providers and cloud-based app providers to enslave us, and to bleed us out. Their whole point is locking us in.
"Sharing models" are totally irrelevant until the concept of self-determination is tolerated by the powerful (and they will never tolerate it). Accessing my data from multiple devices is totally secondary; I don't trust mobile endpoints to access my remote data in the first place.
How about redundancy in general? Not local-first, not cloud-first, but "anything can be first and last". That's how the "cloud" works in the first place: redundancy. Mesh networks as well.
People's personal computers, even their tablets and phones are so powerful, they can fulfill most use cases (except AI), especially if the application is reasonably efficient.
One thing I’m personally excited about is the democratization of software via LLMs.
Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points the unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.
Getting a user to install a local DB and a service to run their app (god forbid, updating said service) is a challenge that's complex even for developers (hence the prevalence of containers).
It will take some time (i.e. pre-training runs), but this is a future I believe is worth fighting for.
> Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points the unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.
Not sure where your experience is coming from but when I asked an LLM, Claude to be more precise, it referred me to local options first, such as SQLite. It didn't consider cloud platforms at all until I had asked, presumably because it can understand local code and data (it can query it directly and get back results) but cannot understand the context of what's in the cloud unless you configure it properly and give it the env variables to query said data.
In my experience it’s great at utilizing local storage and SQLite, if you ask it to.
I just asked the ChatGPT web client (4o, as that’s what most non-developers might default to):
> Can you build me a website for my photos
And it immediately started suggesting Wordpress, Wix, Squarespace, etc.
Specifically, this was section 4 of the various questions it asked me:
> 4. Tech Preference (optional)
> - Do you want this as a static HTML site, WordPress, or built with something like React, Next.js, or Wix/Squarespace?
> - Do you need help hosting it (e.g., using Netlify, Vercel, or shared hosting)?
As a non-programmer, I likely wouldn’t understand half those words, and the section is marked optional.
If I follow the “default path” I’m quickly forking over a credit card and uploading my pictures of dogs/family/rocks to the cloud.
Local LLMs are even more amazing in concept, all of the world's knowledge and someone to guide you through learning it without needing anything but electricity (and a hilariously expensive inference rig) to run it.
I would be surprised if in a decade we won't have local models that are an order of magnitude better than current cloud offerings while being smaller and faster, and affordable ASICs to run them. That'll be the first real challenger to the internet's current position as "the" place for everything. The more the web gets enshittified and commercialized and ad-ridden, the more people will flock to this sort of option.
The speed alone is sufficient reason for a local-first approach. The latency of any cloud software I've ever used is like constant sand in the gears of thinking. Although taking supplements that slow my thinking (essentially natural downers) does improve my experience with such software, the improved experience comes at the expense of IQ. Basically, you need to be a little slow and dumb for the software to work as intended.
This is nuts. Computers are supposed to enhance and enable thinking, not make you stupid. In this sense, cloud software is possibly the biggest fraud ever perpetrated on the paying, computer-using public.
For the love of God, please bring back my late 1990s and early 2000s brain-boosting computer experience.
I mean, what if local-first isn't possible? I'm more comfortable with web-based development. Plus, don't you need to apply for all kinds of certificates to be "allowed" to run on Windows and Mac these days?
Databases like Postgres can be run locally or as part of some kind of managed service in the cloud. Anyone know of recent stats that show the percentage of databases that are managed locally vs by some cloud service?
"Local first" is neither equivalent to privacy protection or public software good. Many businesses sell local-first software that still contains remote backdoors[0] you cannot control. And it most certainly doesn't ensure "public software good" when there is zero obligation to improve the upstream or empower users to seek alternatives.
I would sooner trust a GPL-licensed remote software program than store a kilobyte of personally identifying information in a proprietary "local first" system.
I think you mean antithetical to corrupted conflict-of-interest capitalism.
Conflict-of-interest transactions have hidden or coercive impact, lined up in favor of the party with stronger leverage. Examples include un-asked and unwanted surveillance of data or activity, coercive use of information, vendor lock in, unwanted artificial product/service dependencies, insertion of unwanted interaction (ads), ...
None of that is inherent to capitalism. They clearly violate the spirit of capitalism, free trade, etc.
It is providers taking advantage of customers' lack of leverage and knowledge to extract value beyond the plain transaction customers actually wanted. This is done legally but with surreptitious disclosure or dark-pattern permissions, borderline legally where customers would incur great costs to identify and protest it, or plain old illegally but in a hidden manner, with a massive legal budget as a moat against accountability.
It is tragic that the current generation of Silicon Valley and VC firms have embraced conflict-of-interest business models, because of the amounts of money that scaling "small" conflicts can make, and despite the great damage that we now know scaling up "small" conflicts can do.
The problem with our current system of capitalism is that it causes capital to accumulate. This leads to less competition, fewer checks and balances, and undermines the whole "wisdom of the crowd" mechanism that capitalism is premised on.
If we want a functioning market based system then we need to explicitly correct for this by aggressively taxing the wealthiest entities (individuals and companies) in our society to bring things closer to a level playing field.
"corrupted conflict-of-interest capitalism" is just capitalism.
Free trade is antithetical to capitalism. Free trade means everyone is on a level playing field, but capitalism means those with more capital are above the rest. These are obviously not compatible.
The syncing is just really godawfully slow, so much so that after two years of use I have almost stopped taking notes.
How do you plan to do the syncing without some sort of cloud infrastructure?
Something like Syncthing, perhaps?
Syncthing
There's a git plugin.
You can use FTP and SVN.
Right now it's in WebRTC.
> get updates for X years, sync with all your devices and use it forever. If you want updates after X years buy the newest version again. If its good enough as is - and that's the goal - just keep using it forever.
While this sounds like a good deal, with this approach:
- You have to charge the total cost of the subscription at once (one or two years' worth).
- You still have to keep servers running for syncing, and you have to think about cases where a user syncs a year of data in a single day.
- You have to keep people on the payroll for future development.
(You are thinking only from the developer's perspective here.)
You don't have to keep servers running if there aren't any servers (P2P), or if you offload syncing onto some other cloud.
Had similar thoughts a few years back (https://rodyne.com/?p=1439) when considering worst-case scenarios, after a local factory lost two days' production due to a server failure at an IT supplier.
A backend can be part of the functionality though, such as for real-time collaboration and syncing. But you can have ownership and longevity guarantees for both the data and the service as long as you can eject [1] from the cloud and switch to self-host or back at any time, which is what we do for our notes/tasks IDE
[1] https://thymer.com/local-first-ejectable
Totally agree. If you don't mind - what tech stack are you using for your fitness tracking app? I'm particularly curious about how you handle cross-device sync :)
What if you are an old man and more clouds than ever are appearing which deserve a good fist shaking?
Asking for a friend . . .
>you're not monetizing via ads
Yes, you are. You can find tons of purely local apps that monetize themselves with ads.
Sure you could. I'm not, and I don't think it's in the spirit of local-first. I wouldn't pay money for that, but if you or someone else wants to build that kind of software - it's a free world :)
> You can find tons of purely local apps tha[t] monetize themselves with a[d]s.
How do they do that without hitting the internet?
Bro who wants your pointless fitness data? Not even you care that much for that. Just use a notepad ffs.
Fitness data tells a lot: your health status, your daily schedule, and, with running/cycling/..., your exact whereabouts. That is quite valuable information.
A notepad also isn't enough to correlate heart rate and the like with specific exercises, or to plot trends over time.
> I'm sick of everyone trying to come up with a use case to get all my data in everyone's cloud so I have to pay a subscription fee to just make things work.
AI photo and video generation is impractical to run locally.
ComfyUI and Flux exist, but they serve a tiny sliver of the market with very expensive gamer GPUs. And if you wanted to cater to that market, you'd have to support dozens of different SKUs and deal with Python dependency hell. Even then, proficient ComfyUI users spend hours experimenting and waiting for renders - it's really only a tool for niche artists with extreme patience, such as the ones who build shows for the Las Vegas Sphere, not your average graphic designers and filmmakers.
I've been wanting local apps and local compute for a long time, but AI at the edge is just so immature and underpowered that we might see the next category of apps only being available via the cloud. And I suspect that these apps will start taking over and dominating much of software, especially if they save time.
Previously I'd only want to edit photos and videos locally, but the cloud offerings are just too powerful. Local cannot seriously compete.
But who said anything about AI? Lots of local-first apps neither have nor need any AI whatsoever. And by the way, Topaz Labs has good offerings for editing photos and videos with AI that run locally, and they work great for many use cases (although they're not fully generative like Veo etc. - more upscaling and denoising, which does use generative AI, but not in the same way).
> AI photo and video generation is impractical to run locally.
You think it always will be? What can the new iPhone chips do locally?
There is now a great annual Local-first Software conference in Berlin (https://www.localfirstconf.com/) organised by Ink & Switch, and it has spawned a spin-out, Sync Conf, this November in SF (https://syncconf.dev/).
There was a great panel discussion this year with a number of the co-authors of the linked paper, discussing what local-first software means in the context of dev tools and what they have learnt since the original paper. It's very much worth watching: https://youtu.be/86NmEerklTs?si=Kodd7kD39337CTbf
The community is settling on "sync" being one component of local-first but applicable much more widely, and on local-first being a characteristic of end-user software, with dev tools - such as sync engines - being enablers but not "local-first" in themselves.
The full set of talks from the last couple of years are online here: https://youtube.com/@localfirstconf?si=uHHi5Tsy60ewhQTQ
It's an exciting time for the local-first / sync-engine community. We've been working on tools that enable realtime and async collaborative experiences, and now with the onset of AI the market for this is exploding. Every AI app is inherently multi-user collaborative, with the agents as actors within the system. This requires exactly the tech the sync-engine community has been working on.
There's also an excellent Local First podcast that interviews the engineers building local-first apps and infra!
https://www.localfirst.fm/
Thanks for the info. Didn't know there were sync conferences.
Worth a read, and it's had some very active discussions in the past:
https://news.ycombinator.com/item?id=37743517 - Oct 2023, 50 comments
Anything with online dependencies will necessarily require ongoing upkeep and ongoing costs. If a system is not local-first (or ideally local-only), it’s not designed for long-term dependability.
Connected appliances and cars have got to be the stupidest bit of engineering from a practical standpoint.
The entire thing is because of subscription revenue.
It's self-reinforcing: companies that get subscription revenue have both more revenue and higher valuations, enabling more fundraising, causing them to beat out companies that do not follow this model. This is why local-first software died.
I remember seeing somebody summarize this as "SaaS is a pricing model" or "SaaS is financialization" and it totally rings true. Compared to normal software pricing, a subscription gives you predictable recurring revenue and a natural sort of price discrimination (people who use your system more, pay more). It's also a psychological thing: folks got anchored on really low up-front prices for software, so paying $2000 for something up-front sounds crazy even if you use it daily for years, but paying $25/month feels reasonable. (See also how much people complain about paying $60 for video games which they play for thousands of hours!)
It's sad because the dynamics and incentives around clear, up-front prices seem generally better than SaaS (more user control, less lock-in), but almost all commercial software morphs into SaaS thanks to a mix of psychology, culture and market dynamics.
There are other advantages to having your software and data managed by somebody else, but they are far less determinative than structural and pricing factors. In a slightly different world, it's not hard to imagine relatively expensive software up-front that comes with a smaller, optional (perhaps even third-party!) subscription service for data storage and syncing. It's a shame that we do not live in that world.
The root cause of the problem is maybe that it's easier to make personalized stuff with a server/backend (cloud?) than without.
Example: I made a Firefox extension that automatically fills forms using an LLM. It's fully offline except (optionally) the LLM part - optional because it also supports Ollama locally.
Now the issue is that it's way too hard for most people to use: find an LLM to run, acquire it somehow (pay to run it online or download it to run in Ollama), configure your API URL, enter an API key, and save all of your details for form filling locally in text files, which you then have to back up and synchronize to other devices yourself.
The alternative would be: create an account, hand over money, enter details, and everything is synced and backed up automatically across devices, with an online LLM pre-selected and configured. Ready to go. No messing around with Ollama or OpenRouter; just go.
I don't know how to solve this in a local way that would be as user-friendly as the subscription way would be.
Now things like cars and washing machines are a different story :p
Pretty much greed being a universally destructive force in the world as usual.
When Apple joined the madness, all hope was lost (that was a long time ago now, sigh).
Yeah, Dropbox, Apple, etc. provide enough free or paid storage to show you the true cost: circa $10/month for 2 TB. Cloudflare lets you host static files pretty much for free, or for a cost that is a rounding error.
So you could run a thousand local-first apps that sync to Dropbox for that $10/month in storage. And that storage is full B2C-grade and ready to go, not some low-level S3-like primitive: it has auth, it has support, and it has programs to sync.
Really, most of the cloud cost is not needed.
Anybody opposing the "Stop Killing Games" initiative should read this comment.
Nobody is forcing anybody to make their games rely solely on online services. It's not a legal requirement, a regulatory requirement, or anything else. It is a choice, like most things in software. To make the choice to rely on online services and then say "we'll have to spend money later to unfuck this!" is honestly short-sighted and pathetic, and nobody should accept it.
This was refreshing to read! More apps should be local-first. If the user does not want to sync their data to cloud, they should have that option.
I've been building the offline-first (or local-first) app Brisqi[0] for a while now; it was designed from the ground up with the offline-first philosophy.
In my view, a local-first app is designed to function completely offline for an indefinite period. The local experience is the foundation, not a fallback, and cloud syncing should be a secondary enhancement, not a requirement.
I also don’t consider apps that rely on temporary cache to be offline-first. A true offline-first app should use a local database to persist data. Many apps labeled as “offline-first” are actually just offline-tolerant, they offer limited offline functionality but ultimately depend on reconnecting to the internet.
Building an offline-first app is certainly more challenging than creating an online-only web app. The syncing mechanism must be reliable enough to handle transitions between offline and online states, ensuring that data syncs to the cloud consistently and without loss. I’ve written more about how I approached this in my blog post[1].
[0] https://brisqi.com
[1] https://blog.brisqi.com/posts/how-i-designed-an-offline-firs...
How has it been going? I've been thinking of trying this model but a bit worried about how much harder it would be to make it sustainable as a business
If I were to do it all over again, I'd simplify things even further. I'd start by testing the idea with just the offline version. For syncing, I'd focus on syncing the data itself: just the database records, not the user actions. Instead of replaying every action, I'd track which records were modified and sync only those. No need to capture what changed within a record, just that the record changed. Hopefully that makes sense.
The business is doing alright, but what really keeps me going is that the app is something I genuinely wanted for myself. I use it all the time, it's always open in the background, so that keeps me motivated.
Cool to see principles behind this, although I think it's definitely geared towards the consumer space. Shameless self-plug, but related: we're doing this for industrial assets/industrial data currently (www.sentineldevices.com), where the entire training, analysis and decision-making process happens on customer equipment. We don't even have any servers they can send data to; our model is explicitly geared towards everything happening on-device (so I found the network principle the article discussed really interesting). This is to support use cases in SCADA/industrial automation where you just can't bring data to the outside world.
There is, IMO, a huge customer base and set of use cases that are just casually ignored by data/AI companies, because actually providing a service where the customer/user is is too hard, and they'd prefer to have the data come to them while keeping vendor lock-in.
The funny part is, in discussions with customers we actually have to lean in and be very clear on the "no, this is local, there's no external connectivity" piece, because they really don't hear that anywhere, and sometimes we have to walk them through it step by step to help them understand that everything is happening locally. It also tends to break the brains of software vendors. I hope local-first software starts taking hold more in the consumer space so we can see people start getting used to it in the industrial space.
It doesn't help that all the SCADA vendors are jumping on the cloud wagon and trying to push us all in that direction. "Run your factory from your smartphone!" Great, now I'm one zero-day away from some script kiddie playing around with my pumps.
An exciting space and I'm glad you and your team are working in it.
I looked over your careers page and see all of your positions are non-remote. Is that because working on local-first software requires you to be in person, or is it primarily a management decision?
Personally, I disagree with this approach. This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).
The problems with closed-source software (lack of control, lack of reliability) were solved with a new business model: open source development, which came with new licenses and new ways of getting revenue (maintenance contracts instead of license fees).
In the same way, we need a business model solution to cloud-vendor ills.
Imagine we create standard contracts/licenses that define rights so that users can be confident of their relationship with cloud-vendors. Over time, maybe users would only deal with vendors that had these licenses. The rights would be something like:
* End-of-life contracts: cloud-vendors should contractually spell out what happens if they can't afford to keep the servers running.
* Data portability guarantees: Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.
* Data privacy transparency: Vendors must track/audit all data access and report to the user who/what read their data and when.
I'm sure you can think of a dozen other clauses.
The tricky part is, of course, adoption. What's in it for the cloud-vendors? Why would they adopt this? The major fear of cloud-vendors is, I think, churn. If you're paying lots of money to get people to try your service, you have to make sure they don't churn out, or you'll lose money. Maybe these contracts come only with annual subscription terms. Or maybe the appeal of these contracts is enough for vendors to charge more.
> This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).
Whenever it's possible to solve a business problem or political problem with a technical solution, that's usually a strong approach, because those problems are caused by an adversarial entity and the technical solution is to eliminate the adversarial entity's ability to defect.
Encryption is a great example of this if you are going to use a cloud service. Trying to protect your data with privacy policies and bureaucratic rules is a fool's errand because there are too many perverse incentives. The data is valuable, neither the customer nor the government can easily tell if the company is selling it behind their backs, it's also hard to tell if the provider has cheaped out on security until it's too late, etc.
But if it's encrypted on the client device and you can prove with math that the server has no access to the plaintext, you don't have to worry about any of that.
The trouble is sometimes you want the server to process the data and not just store it, and then the technical solution becomes, use your own servers.
I 100% agree, actually. When a technical solution exists, it's usually the better approach.
For something like data portability (being able to take my data to a different provider), a technical solution is probably required.
But other problems, like enshittification, can't be solved technically. How do you technically prevent a cloud vendor from changing their pricing?
And you're right that the solution space is constrained by technical limits. If you want to share data with another user, you either need to trust a central authority or use a distributed protocol like blockchain. The former means you need to trust the central provider; the latter means you have to do your own key-management (how much money has been lost by people forgetting the keys to their wallet?)
There is no technical solution that gets you all the benefits of central plus all the benefits of local-first. There will always be trade-offs.
Does this really solve the problem? Let's say I'm using a cloud provider for some service I enjoy. They have documents that spell out that if they have to close their doors they will give X months of notice and allow for a data export. Ok, great. Now they decide to shut their doors and honor those agreements. What am I left with? A giant JSON file that is effectively useless unless I decide to write my own app, or some nice stranger does? The thought is there, it's better than nothing, but it's not as good as having a local app that will keep running, potentially for years or decades, after the company shuts their doors or drops support.
Data portability is, I think, useful even before the service shuts down. If I'm using some Google cloud-service and I can easily move all my data to a competing service, then there will be competition for my business.
What if cloud platforms were more like brokerage firms? I can move my stocks from UBS to Fidelity by filling out a few forms and everything moves (somewhat) seamlessly.
My data should be the same way. I should be able to move all my data out of Google and move it to Microsoft with a few clicks without losing any documents or even my folder hierarchy. [Disclaimer: Maybe this is possible already and I'm just out of the loop. If so, though, extend to all SaaS vendors and all data.]
This would make cloud vendors kind of like banks. The cloud vendor is holding a kind of property for the user in the user's account. The user would have clearly defined rights to that property, and the legal ability to call this property back to themselves from the account.
This calling back might amount to taking delivery. In a banking context, that is where the user takes delivery of whatever money and other property is in the account. In the cloud vendor case, this would be the user receiving a big Zip file with all the contents of the account.
Taking delivery is not always practical and is also not always desirable. Another option in a financial context is transferring accounts from one vendor to another: this can take the form of wiring money or sometimes involves a specialized transfer process. Transferring the account is probably way more useful for many cloud services.
This leads us to a hard thing about these services, though: portability. Say we delineate a clear property interest for users in their cloud accounts and we delineate all of their rights. We have some good interests and some good rights; but what does it mean to take delivery of your Facebook friends? What does it mean to transfer your Facebook account from one place to another?
> This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).
I don't think that's quite correct. I think the authors fully acknowledge that the business case for local-first is not completely solved and is a closely related problem. These issues need both a business and a technical solution, and the paper proposes a set of characteristics of what a solution could look like.
It's also incorrect to suggest that local-first is an argument for decentralisation - Martin Kleppmann has explicitly stated that he doesn't think decentralised tech solves these issues in a way that could become mass market. He is a proponent of centralised standardised sync engines that enable the ideals of local-first. See his talk from Local-first conf last year: https://youtu.be/NMq0vncHJvU?si=ilsQqIAncq0sBW95
I'm sure I'm missing a lot, but the paper is proposing CRDTs (Conflict-free Replicated Data Types) as the way to get all seven checkmarks. That is fundamentally a distributed solution, not a centralized one (since you don't need CRDTs if you have a central server).
And while they spend a lot of time on CRDTs as a technical solution, I didn't see any suggestions for business model solutions.
In fact, if we had a business model solution--particularly one where your data is not tied to a specific cloud-vendor--then decentralization would not be needed.
I get that they are trying to solve multiple problems with CRDTs (such as latency and offline support), but in my experience (we did this with Groove in the early 2000s) the trade-offs are too big for average users.
Tech has improved since then, of course, so maybe it will work this time.
> * Data portability guarantees: Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.
This is not practical for data of any size. Prod migrations to a new database take months or even years if you want things to go smoothly. In a crisis you can do it in weeks, but it can be really ugly. That applies even when moving between the same version of an open source database, because there's a lot of variation between the cloud services themselves.
The best solution is to have the data in your own environment to begin with and just unplug. It's possible with bring-your-own-cloud management combined with open source.
My company operates a BYOC data product which means I have an economic interest in this approach. On the other hand I've seen it work, so I know it's possible.
I'd love to know more about BYOC. Does that apply to the raw data (e.g., the database lives inside the enterprise) or the entire application stack (e.g., the enterprise is effectively self-hosting the cloud).
It seems like you'd need the latter to truly be immune to cloud-vendor problems. [But I may not understand how it works.]
> Personally, I disagree with this approach. This is trying to solve a business problem (I can't trust cloud-providers)
It is not only a business problem. I stay away from cloud-based services not only because of the subscription model, but also because I want my data to be safe.
When you send data to a cloud service, and that data is not encrypted locally before being sent to the cloud (a rare feature), it is not a question of if but when that data will be pwned.
I have spent the last decade or so working in digital forensics and incident response for a series of well-known SaaS companies.
The experience has made me a big fan of self hosting.
"Trust about whether or not another company will maintain confidentiality" still sounds like a business problem to me (or at least one valid way of perceiving the problem)
And the biggest advantage I see of this perspective over the "technical problem" perspective is that assigning responsibility completely covers the problem space, while "hope that some clever math formula can magic the problem away" does not.
> End-of-life contracts: cloud-vendors should contractually spell out what happens if they can't afford to keep the servers running.
I'm trying to imagine how this would be enforced when a company shutters and its principals walk away.
Putting stuff in escrow is usually the way to go: escrow service is paid upfront (say, always for the next 3 months), and that's the time you've got to pull out your data.
My company does that with a few small vendors we've got for the source code we depend on.
It's a good question--I am not a lawyer.
But that's the point of contracts, right? When a company shuts down, the contracts become part of the liabilities. E.g., if the contract says "you must pay each customer $1000 if we shut down" then the customers become creditors in a bankruptcy proceeding. It doesn't guarantee that they get all (or any) money, but their interests are negotiated by the bankruptcy judge.
Similarly, I can imagine a contract that says, "if the company shuts down, all our software becomes open source." Again, this would be managed by a bankruptcy judge who would mandate a release instead of allowing the creditors to gain the IP.
Another possibility is for the company to create a legal trust that is funded to keep the servers running (at a minimal level) for some specified amount of time.
(cont. thinking...) One possibility: a third party manages a continually updating data escrow. It'd add some expense and complexity for the going concern.
> Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.
Anecdotally, I've never worked anywhere where the data formats were documented in any way other than a schema in code.
> This is trying to solve a business problem (I can't trust cloud-providers)
Not necessarily. I like local-first due to robust syncing via CRDTs, not because I somehow want to avoid cloud providers.
A good contract can help you to seek some restitution if wrongdoing is done and you become aware of it and you can prove it. It won't mechanically prevent the wrongdoing from happening.
It can also help to align the incentives of multiple parties to actually care about the same goals.
"Mechanically preventing wrongdoing from happening" can be a bit of a Shangri-La. What Tech can mechanically do is increase the cost of wrongdoing, or temporarily deflect attempts towards easier targets. But that by definition cannot "solve the problem for everyone" as there will always be a lowest hanging fruit remaining somewhere.
What contracts can do is help to reduce the demand for wrongdoing.
Currently there are laws, but not for hosting. Look at Steam's contract, for example, or Ubisoft's, or anyone else's. Q: What happens to your game collection if we shut down our servers? A: You own nothing and lose everything, GG!
It's like saying we must protect users' privacy from greedy websites, so we'll make the bad ones spell out that they use cookies to spy on users; the result is what we have now with the banners.
I agree with you! And your point about cookie banners underlines that we can't just rely on regulation (because companies are so good at subverting regulations or outright lobbying their way out of them).
Just as with the open source movement, there needs to be a business model (and don't forget that OSS is a business model, not a technology) that competes with the old way of doing things.
Getting that new business model to work is the hard part, but we did it once with open source and I think we can do it again with cloud infrastructure. But I don't think local-first is the answer--that's just a dead end because normal users will never go with it.
Is it trying to solve a business problem? I think it's trying to solve a more general problem which has nothing to do with business.
It's ok to just solve the problem and let the businesses fail. Predation is healthy for the herd. Capitalism finds a way, we don't have to protect it.
That’s essentially what I’m trying to make widely available through my projects https://github.com/ibizaman/selfhostblocks and https://github.com/ibizaman/skarabox. Their shared goal is to make self-hosting more approachable to the masses.
It’s based on NixOS to provide as much as possible out of the box and declaratively: https, SSO, LDAP, backups, ZFS w/ snapshots, etc.
It’s a competitor to cloud hosting because it packages Vaultwarden and Nextcloud to store most of your data. It does provide more services than that though, home assistant for example.
It's a competitor to YunoHost but IMO better (or aims to be), because you can use the building blocks provided by SelfHostBlocks to self-host any packages you want. It's more of a library than a framework.
It’s a competitor to NAS but better because everything is open source.
It still requires the user to be technical but I’m working on removing that caveat. One of my goals is to allow to install it on your hardware without needing nix or touching the command line.
Love it! I've been thinking about this a lot lately. It's crazy how many great FOSS alternatives are out there to everything – and while they might be relatively easy to install for tech-people ("docker compose up"), they are still out of reach for non-tech people.
Also, so many of these selfhostable apps are web applications with a db, server and frontend, but for a lot of use cases (at least for me personally) you just use it on one machine and don't even need a "hosted" version or any kind of sync to another device. A completely local desktop program would suffice. For example I do personal accounting once a month on my computer – no need to have a web app running 24/7 somewhere else. I want to turn on the program, do my work, and then turn it off. While I can achieve that easily as a developer, most of the people can't. There seems to be a huge misalignment (for lack of a better word) between the amount of high-quality selfhostable FOSS alternatives and the amount of people that can actually use them. I think we need more projects like yours, where the goal is to close that gap.
I will definitely try to use selfhostblocks for a few things and try to contribute, keep it up!
My guess as to why most apps are now a web UI on top of a DB is because it’s easy to “install”. SelfHostBlocks is admittedly geared towards a central server serving web apps. Or at least apps with a desktop or mobile component but geared towards synching to a central server.
Feel free to give it a try though, I'd love that! Also feel free to join the Matrix channel if you have any questions or just to get some updates.
I love that you include hledger! It's an amazing piece of software, even if a little obscure for people unfamiliar with plain-text accounting!
I love that application. I plan to make some improvements to the web UI. I’d love to have multiple tabs with saved reports. That would allow my spouse to use it quite easily. I’ll be adding that at some point.
Looks really neat! Thanks for building this
Thank you for the kind words :)
There is no reason for every application to have its own sync platform. I suspect this framing came out of mobile apps where there is no composability or modularity between programs.
If you really embrace "local first" just use the file system, and the user can choose from many solutions like git, box, etc.
I hate signing up for your sync just as much as for any other SaaS, but it's even more opaque and likely to break.
I agree that not every app needs its own sync engine, but I disagree with your framing that the file system is the universal way to embrace local-first. I have two reasons.
First is that yeah, local first, but I also want concurrency. If it's just local first, you're right, any old sync will do. But I want more than that. I want to not have to think (a la dropbox, being slick). I want my wife and I to be able to make separate edits on our phones when we're in a dead zone.
Second is that sync works a lot better when it has deep knowledge of the data structure and semantics. Git and box both have significant shortcomings, but both exacerbated by the concurrency desire.
But this problem isn't going to be solved by every app making its own sync system. Even if there were a magic library you could adopt that does a pretty good job, you'd still end up with everyone having their own completely independent hosting solution and sync schedule.
If files are insufficient, what data-structure would make modular sync possible for multiple applications in an OS?
And I'm not suggesting one doesn't exist; I'm challenging you to present a comprehensive solution, which probably involves operating systems.
> I want my wife and I to be able to make separate edits on our phones when we're in a dead zone.
Files do this.
If the app is designed for it, you can use a hybrid approach where a given "document" is stored in one file per client, and the client merges the changes across all files. That way there's never a change conflict that something like Dropbox needs to handle; it can all be offloaded to the app.
I mostly agree with this, but sometimes it's not that simple in practice. I created an app that did exactly this and it resulted in inevitable file conflicts because I couldn't negotiate between the clients when a file should be allowed for editing.
In theory, I love the local-first mode of building. It aligns well with “small tech” philosophy where privacy and data ownership are fundamental.
In practice, it’s hard! You’re effectively responsible for building a sync engine, handling conflict resolution, managing schema migration, etc.
This said, tools for local-first software development seem to have improved in the past couple years. I keep my eye on jazz.tools, electric-sql, and Rocicorp’s Zero. Are there others?
CouchDB on the server and PouchDB on the client was an attempt at making such an environment:
- https://couchdb.apache.org/
- https://pouchdb.com/
Also some more pondering on local-first application development from a "few" (~10) years back can be found here: https://unhosted.org/
And RxDB. https://rxdb.info/
Using Couch/Pouch on our current app for this reason. Great to work with. Though we’re not supporting offline-first right away (depends on external services), it’s going to help with resilience and a future escape valve.
Lotus Notes always deserves a mention in these threads too, as 1989's answer to local-first development. CouchDB was heavily inspired by Notes.
I've been using instantdb in anger for the past month or so for a side project of mine. I'm building a personal budget app.
I should probably write a blog post, but I will say that I investigated PowerSync, ElectricSQL, and LiveStore beforehand. I briefly looked at Jazz Tools but wanted something a bit more structured.
I'm pretty impressed thus far. I've actually been writing it with Vue and a community library. Permissions were a bit tricky, but once I figured them out it was simple. I like their magic email login, and I like their dashboard/REPL, though there are a few big changes I would make there to make it less fiddly.
I love that it's open source, and that if I want to, I could self host it.
As for the other options:
- Jazz wasn't structured enough.
- LiveStore came off as too fiddly with the event store, but it was appealing. That the dev tools are paywalled was disappointing, but understandable.
- ElectricSQL really only provided half a solution (reads, not the write model).
- CouchDB/PouchDB wasn't structured enough for me, and I wanted better cross-document support than was obvious / baked in.
- I did not really investigate Zero.
+1 for Instant! Been using it and I find it a breeze to work with; it fills the exact niche this article was discussing. Sync engines are the future!
[Instant founder]
This brightened my day. If you have any feedback, please let us know! We're on the Discord, and we answer email at founders@instantdb
Do you know that website? https://www.localfirst.fm
EDIT: actually I wanted to point to the "landscape" link (in the top menu) but that URL is quite unergonomic.
No, I didn't know about it -- thank you! (EDIT: and the landscape page has lots of libraries I hadn't run across before. Neat.)
I think I saw someone point out automerge not long ago:
https://automerge.org/
Rust and JavaScript implementations, a handful of network strategies. It doesn't come with the free or paid offering that jazz.tools does, but it's pretty nice.
I like https://loro.dev personally, also in Rust and JS. Many such CRDTs are being built in Rust these days.
You might also want to check out Ditto:
https://www.ditto.com
It’s a local-first platform that supports real-time sync with CRDTs at its core, making conflict resolution much easier to manage. Ditto is designed to handle offline-first use cases and peer-to-peer sync out of the box, so you don’t have to build a custom sync engine from scratch.
It supports a wide range of platforms including Swift, Kotlin (Android), Flutter/Dart, React Native, JavaScript (Web/Node), .NET (C#), C++, Java, and Rust. You can dive deeper into what it offers from the docs site: https://docs.ditto.live/home/about-ditto
Along with the others mentioned, it's worth highlighting Yjs. It's an incredible CRDT toolkit that enables many of the realtime and async collaborative editing experiences you want from local-first software.
https://yjs.dev/
I’ve built several apps on yjs and highly recommend it. My only complaint is that storing user data as a CRDT isn’t great for being able to inspect or query the user data server-side (or outside the application). You have to load all the user’s data into memory via the yjs library before you can work with any part of it. There are major benefits to CRDTs but I don’t think this trade-off is worth it for all projects.
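For illustration, a minimal sketch of the problem with the yjs package (the "note"/"title" names are made up for this example): even reading one field server-side means hydrating the whole document.

    import * as Y from "yjs";

    // There is no partial read: to answer even a tiny query ("what's the
    // title?"), the full update log must be loaded into a Y.Doc first.
    function readTitle(storedUpdate: Uint8Array): string | undefined {
      const doc = new Y.Doc();
      Y.applyUpdate(doc, storedUpdate); // hydrates the entire document into memory
      return doc.getMap("note").get("title") as string | undefined;
    }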
I use local software and sync files using git or sometimes fossil (both work fine on Android with termux, for instance, for stuff I want to access on my phone). I don't host servers or use any special software that requires syncing data in special ways.
There are a bunch and quite a breadth of different solutions/takes on the problem.
Here is a good recap of the current players. https://www.localfirst.fm/landscape
There's also PowerSync: https://www.powersync.com/
It's also open source and has bindings for Dart, JS, Swift, C#, Kotlin, etc
This site also has a directory of devtools: https://lofi.so/
I've always thought that this article overstates the promise of CRDTs with regard to conflict resolution. For toy cases like a TODO list, yes, you can define your operations such that a computer can automatically reconcile conflicts - e.g. you only support "add" and "mark as complete", and if something gets marked as complete twice, that's fine.
But once you get past toy examples, you start wanting to support operations like "edit", and there generally isn't a way to infer the user's intent there. Like, if my cookie recipe starts with 100g of sugar, and I modify it on my phone to use 200g of sugar, and I modify it on my desktop to use 150g of honey instead of 100g of sugar, there are a bunch of ways to reconcile that:
1. Stick with 200g of sugar, drop the 1.5x honey substitution.
2. Stick with 150g of honey, drop the 2x.
3. Merge them - 300g of honey.
4. Merge them - 150g of honey and 50g of sugar.
There's no way for any automated system to infer my intent there. So you've got to either:
1. Ask the user to resolve the conflict. This means you have to build out the whole "resolve this merge conflict for me" UI and the promise of "conflict-free" has not been fulfilled.
2. Arbitrarily choose an option and silently merge. This risks badly surprising the user and losing changes.
3. Arbitrarily choose an option, but expose the fact that you've auto-resolved a conflict and allow the user to manually re-resolve. This requires even more UI work than option 1.
4. Constrain your data model to only allow representing intents that can be deterministically resolved. In practice I think this is too severe of a constraint to allow building anything other than toy apps.
IMO #1 and #3 are the least-bad options, but I don't think they're consistent with the expectations you'd have for CRDTs after reading this article.
(FWIW, https://automerge.org/docs/reference/documents/conflicts/ is the relevant documentation for their Automerge library. It looks like they've chosen option 3.)
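To make option 3 concrete, here's a hedged sketch against the @automerge/automerge JS package (the Recipe shape is invented for illustration): concurrent writes to one field get an arbitrary but deterministic winner, while the losing values stay queryable for a re-resolve UI.

    import * as Automerge from "@automerge/automerge";

    type Recipe = { sweetener: string };

    const base = Automerge.from<Recipe>({ sweetener: "100g sugar" });

    // Two offline replicas (clone gives each a fresh actor ID) edit the same field.
    const phone = Automerge.change(Automerge.clone(base), d => { d.sweetener = "200g sugar"; });
    const desktop = Automerge.change(Automerge.clone(base), d => { d.sweetener = "150g honey"; });

    const merged = Automerge.merge(phone, desktop);
    console.log(merged.sweetener); // an arbitrary (but deterministic) winner

    // The losing write isn't discarded; an app can surface it for manual re-resolution.
    console.log(Automerge.getConflicts(merged, "sweetener"));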
> Arbitrarily choose an option, but expose the fact that you've auto-resolved a conflict and allow the user to manually re-resolve. This requires even more UI work than option 1.
This is what every "cloud file sharing" provider like Dropbox is doing. If there is a conflict, the version on the server is "the right one", and your locally conflicted file is copied on the side with some annotation in the file name.
Yeah, but Dropbox is sort of playing on easy mode, because the data is "just files" and you can manually resolve the conflict with regular old text editors, etc. If you don't expose your app's data model in the file system (and on a phone you generally wouldn't), that means you need to write something custom to resolve the conflicts.
"it is desirable for the software to run as a locally installed executable on your device, rather than a tab in a web browser."
An OS-agnostic app, meaning a web app, is such a killer feature. You can use the same app on Linux, macOS, Windows, Android, and iOS.
Even more, developing for the web is typically faster. You make a change in the code => you see the result on the screen. For example: phone apps written in Swift can be faster than ones written in React Native, but it is so annoying waiting for the compilation to finish after every small change.
----
When I worked on imagery data at an autonomous vehicles company, product managers pushed us to explore the data in the cloud, and it was soooo inconvenient.
As a result, the PMs were ignored and everyone had a personal desktop with GPUs and fast SSDs holding a local copy of the data, so that debugging and prototyping would be fast.
The lag you get working with heavy data remotely reminded me of moving from an SSD back to a slow HDD, where you had to wait to see the result on the screen.
It was only half a second every time, but felt ultra annoying.
Self hosting (which is often adjacent to local-first software) is fine. I've done it for years.
But it is a nightmare when it goes wrong: the conclusion I've reached is that it is out of reach for regular people who don't want the Byzantine support load that can accompany failures. They want turnkey. They want simple. They aren't interested in operating services; they're interested in using them.
The FLOSS model of self hosting doesn't really offer a reliable way of getting this: most businesses operating this way are undercapitalised and have little hope of ever being any other way. Many are just hobbies. There are a few exceptions, but they're rare and fundamentally the possibility of needing support still exists.
What is needed, imo, is to leverage the power of centralised, professional operations and development, but to govern it democratically. This means cooperatives where users are active participants in governance alongside employees.
I've done a little work towards this myself, in the form of a not-yet-seen-the-light-of-day project.
What I'd love to see is a set of developers and operators actually getting paid for their work and users getting a better deal in terms of cost, service, and privacy, on their own (aggregate) terms. Honestly, I'd love to be one of them.
Does anyone think this has legs to the same extent as local-first or self hosting? Curious to know people's responses.
This is the business model I want to have: I work on a stack of fully open source software and package them in a turn-key server that you own. You can use it on your own for free if you’re knowledgeable and I offer a subscription where I’m the sysadmin of the box you own and that I built for you. I do the maintenance, the updates, etc. There’s no lock-in because you can stop the subscription anytime or even just pick another sysadmin that would know the stack. The only reason you’d keep me around would be that the service I offer is pretty damn good. Would something like that appeal to you?
I was about to suggest that a better, more open, and fair form of capitalism would need to be used as a tool... but then, re-reading your comment ("...leverage the power of centralised, professional operations and development, but to govern it democratically...") I think that better encapsulates what I meant to convey. :-)
That being said, yes, I do believe *in the near/upcoming future* local-first, self-hosting, and (I will add) more fair open source vendors will work! Well, at least, I hope so! I say that because Europe's recent desire to pivot away from the big U.S. tech companies, and towards more digital sovereignty, in my opinion lays the foundation for an ecosystem that could sustain self-hosting, etc. The more that Europe is able to pivot away from big tech, the more possibility exists for more and varied non-big-tech vendors to manifest... and the more that Europe adopts open source, the more the possibility that usage of and expertise in self-hosting grows... plus, for those who do not know how to, or simply do not wish to, manage services themselves, well, in time I think Europe will have fostered a vast array of vendors who can provide such open source digital services and get paid a fair cost for providing fair value/services, etc. ...and, by the way, I say this all as a person biased in favor of open source AS WELL AS being an American. :-)
> What is needed, imo, is to leverage the power of centralised, professional operations and development, but to govern it democratically. This means cooperatives where users are active participants in governance alongside employees.
Utopia. Unattainable. Self-determination of the individual has been consistently persecuted under all societal arrangements; communism and capitalism equally hate a citizen that wants to remain independent and self-sufficient.
This comment ends just when it was getting good.
We need a term for a viable business model to pair with local-first tech.
I've been working on Relay [0] (realtime multiplayer for Obsidian) and we're trying to follow Tailscale's approach by separating out the compute/document sync from our auth control plane.
This means that users still subscribe to our service (and help fund development) and do authn/authz through our service, but we can keep their data entirely private (we can't access it).
[0] https://relay.md
Relay user here! It’s great. Quite reliable for an early product.
Thanks for the kind words
Are you requiring a Google account for file/folder-based auth on a per-user basis for a vault? Not too keen on using a 3rd party for this kind of thing.
For our free/individual plan we do use OAuth2 providers (currently only Google is enabled, but considering others), and can support other methods for larger teams (like oidc).
Originally the idea was to keep everything within the Obsidian UI so things like username/password didn't make sense (no password managers in Obsidian).
We initiate the OAuth2 login flow from within Obsidian. I guess we could add an extra click that takes you to our website first and then support more auth methods from there. I don't really want it to feel like a web app though.
I'd love to hear your take. Which login method do you think is both simple and could be coherently used within Obsidian on all platforms?
I've been building exactly this with SoundLeaf [0] - an iOS client for the excellent open-source Audiobookshelf server. No data collection, no third-party servers, just your audiobooks syncing directly with your own instance.
The user-friendliness challenge is real though. Setting up Audiobookshelf [1] is more work than "just sign up," but once you have it running, the local-first client becomes much cleaner to build. No user accounts, no subscription billing, no scaling concerns. Simple pricing too: buy once, own forever. No monthly fees to access your own audiobooks.
[0] https://soundleafapp.com
[1] https://github.com/advplyr/audiobookshelf
The primary challenge with building local first software is the sync layer. The current 3rd party offerings are not mature. And people have been working on these for a few years. Electric SQL comes to mind.
I'm curious what you experienced that caused you to conclude that 3rd-party sync solutions are not mature. Several vendors like Ditto have been building local-first sync solutions that are used by very large companies with success.
As a local-first developer, I'd say the biggest challenge is P2P. Or more specifically, NAT traversal and the need for a TURN server.
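Concretely (the server URLs and credentials below are placeholders): STUN just discovers your public address and costs almost nothing, but when both peers sit behind symmetric NATs, traffic has to relay through a TURN server that somebody must run and pay for, which quietly reintroduces infrastructure into a "P2P" app.

    // Placeholder ICE configuration for a browser WebRTC connection.
    const pc = new RTCPeerConnection({
      iceServers: [
        { urls: "stun:stun.example.org:3478" }, // cheap: only discovers the public address
        {
          urls: "turn:turn.example.org:3478",   // expensive: relays all traffic when P2P fails
          username: "app-user",
          credential: "ephemeral-token",
        },
      ],
    });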
I've been wanting a computing model I call PAO [1] for a long time. PAO would run personal application "servers" and connect dynamic clients across all devices. PAO is centralized, but centralized per user, and operating at their discretion. It avoids synchronization, complex concurrent data structures, and many other problems associated with alternatives. Its weakness is a need for always-on networks, but that complication seems ever easier to accept as omnipresent networks become realistic.
[1] https://tiamat.tsotech.com/pao (2012)
It's a very exciting moment for this movement. A lot of the research and tech for local-first is nearing the point that it's mature, efficient, and packaged into well designed APIs.
Moreover, local-first—at least in theory—enables less infrastructure, which could reignite new indie open source software with less vendor lock-in.
However, despite all my excitement about embracing these ideas in the pursuit of better software, there's one hurdle preventing more widespread adoption amongst developers, and that is the Web platform.
The Web platform lacks building blocks for distributing hashed and/or signed software that isn't tied to origins. In other words, it's hard to decouple web-apps from the same-origin model which requires you set up a domain and serve requests dynamically.
Service Workers and PWAs do help a bit in terms of building offline experiences, but if you want users to download once, and upgrade when they want (and internet is available), you can't use the Web. So you end up breaking out of the browser, and start using Web technologies outside of the browser with better OS functionality, like Electron, React Native, Tauri et al (the https://userandagents.com/ community is doing some cool experiments in this space).
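To be concrete about what service workers do give you, here's the usual app-shell sketch (file names are placeholders). The catch for local-first is outside this code: the browser revalidates the worker script against the origin on its own schedule, so "download once, upgrade only when the user opts in" isn't something you control.

    // sw.ts -- cache the app shell at install time, serve cache-first afterwards.
    declare const self: ServiceWorkerGlobalScope;

    self.addEventListener("install", (event) => {
      event.waitUntil(
        caches.open("shell-v1").then((c) => c.addAll(["/", "/app.js", "/app.css"]))
      );
    });

    self.addEventListener("fetch", (event) => {
      event.respondWith(
        caches.match(event.request).then((hit) => hit ?? fetch(event.request))
      );
    });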
We need to get back to apps rather than webapps. The hardware compatibility issues of the past are basically gone, and there are three major OS types, two of which can use each other's apps.
Pretty much the opposite. Local-first makes web apps feel just like apps, without the native-apps security risks.
Perhaps, but then how will they be authored? In what language and with what GUI toolkit?
I view everyone flocking around Electron as proof of a failure on this front.
I recently started using Typst instead of Pandoc->LaTeX.
I held off on playing with Typst for years because I was under the (incorrect) impression that the only way to use it was with their web editor. I'm sure that their editor is completely fine, but I am pretty entrenched in Neovim and Pandoc had been serving me well.
Once I found out that Typst has a command line version that I can use directly, it became more appealing, because I'm pretty sick of cloud shit.
The data part aside, and specifically on the platform/functionality side: these cloud/large products unfortunately do offer more powerful/advanced features, or convenience. Be it cloud multi-device functionality that makes moving around and collaborating seamless, or enterprise products like Snowflake and Fabric that offer all sorts of extras over a standard MSSQL database.
I'm personally very against vendor lock in, but there is some value to them.
> "we have gone further than other projects down the path towards production-ready local-first applications based on CRDTs"
This seems like a bold claim, but IMHO Ink & Switch have earned their solid reputation and it wouldn't surprise me if it's true. I agree w/ their analysis and am philosophically aligned w/ their user-centric worldview. So who's going to build "Firebase for CRDTs"?
> Firebase for CRDTs
Do you actually need anything special for CRDTs over a normal database? My understanding is the actual CRDT part is done "client side"
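For instance, a sketch of a "dumb" sync backend (endpoint names invented for illustration): the server treats CRDT updates as opaque blobs in an append-only log, and clients do the merging locally. What a "Firebase for CRDTs" would presumably add is the unglamorous rest: compaction, fan-out, auth, presence.

    import express from "express";

    const log = new Map<string, Uint8Array[]>(); // docId -> ordered update blobs

    const app = express();
    app.use(express.raw({ type: "application/octet-stream" }));

    // Clients append updates; the server never interprets them.
    app.post("/docs/:id/updates", (req, res) => {
      const updates = log.get(req.params.id) ?? [];
      updates.push(new Uint8Array(req.body));
      log.set(req.params.id, updates);
      res.sendStatus(204);
    });

    // Clients fetch the log and perform the CRDT merge themselves.
    app.get("/docs/:id/updates", (req, res) => {
      const updates = log.get(req.params.id) ?? [];
      res.json(updates.map((u) => Buffer.from(u).toString("base64")));
    });

    app.listen(3000);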
I was just referring to the posted article's assertion that "Firebase for CRDTs" is a huge opportunity. I think I agree w the authors that a well-architected CRDT solution for local-first apps requires capabilities not currently provided by Firebase or any other vendor. But I'm no expert.
Interesting to think about the concept of local-first in the age of AI.
Wanting to be able to run AI fully privately, and offline is the reason we created Cactus:
https://github.com/cactus-compute/cactus
Fully open-source, cross-platform & blazing-fast; lets you plug private AI into any app on your phone.
Complete agreement. Here's a brief, practical action plan for Windows users:
Remember this -- cloud storage only has advantages for Microsoft and law enforcement (which have a number of easy ways to gain access to your documents compared to local storage). For a Windows user, cloud storage is the ultimate Dark Pattern.
This reminds me of my own painful story: I once made a local photo search app called Queryable that ported OpenAI's CLIP model to iPhone, letting you search your photos with queries like "a black cat sitting on a sofa."
Since it needed to access users' local photo libraries, I didn't want the app to connect to the internet under any circumstances. So I made it a paid app instead of the usual free+in-app purchases model, since the latter requires calling StoreKit which goes online. But because the app had to run the CLIP model, it would crash on lower-performance phones like the iPhone X. Users who paid for it couldn't use it and felt scammed, leading to tons of one-star reviews and angry complaints about their photos being stolen. Eventually I decided to open-source the app, though it never brought me much revenue anyway.
Two years later, Apple started announcing they'd be integrating this exact feature into Apple Intelligence : )
Couldn’t you have just restricted the app to being installable on only certain iPhone models?
Apple doesn't allow developers to target specific device models, presumably to prevent discrimination. However, you have two options: 1. Set a minimum iOS version requirement, or 2. Restrict to devices with A12 chips or later. But neither approach can exclude certain problematic device models.
Lately, I have been following this approach and moving toward local-first software. I like simple software with barebones features.
- Password manager: KeePassXC
- Notes: Logseq
- Analytics: Plausible
- Media: Jellyfin
- Uptime monitoring: Uptime Kuma
- Finance tracker: Actual Budget etc. felt too heavy, so I built this: https://github.com/neberej/freemycash/
- Search: Whoogle? is kinda dead. Need an alternative.
For passwords: Enpass is also a wonderful local-first password manager (with optional LAN or cloud sync options)
What are the top web local first frameworks worth checking out these days? i’ve heard of livestore, tanstack DB with electric, zero. any others that are easy to use and flexible? use case is multiplayer apps and maybe games. thanks!
multisynq.io (formerly croquet.io), an Alan Kay team project. Dead simple. Synchronized execution; synchronized data comes along for free.
The old model—a one-time purchase, local install, full user control—worked because devs could sell boxed software at scale. Now, that model collapses unless someone's willing to either undervalue their own labour or treat the software like a public good, absorbing the long tail of maintenance with no recurring income.
The article posits it as though subscription software is something which has been sneaked in on us. But users today expect things like instant updates, sync across devices, collaboration, and constant bug fixes and patches - none of which come easily if you're only willing to pay for the system once.
> as though subscription software is something which has been sneaked in on us
Oh but it has (IMO).
> users today expect things like instant updates [...] constant bug fixes and patches
Nah, this is in reverse. With boxed software, the developer had to deliver an essentially bug-free product. Now, with easy updates technically possible, the developers have gone complacent, and deliver shit. That is why users expect bugfixes instantly. (And any enlightened user abhors unrequested features, as there are no features without regressions, and who wants regressions in any serious application?) The only tolerable online updates are security fixes.
> sync across devices, collaboration
This is a valid expectation, but its execution has been a train-wreck. Research, design and implementation should start with end-to-end encryption; the network architecture should be peer-to-peer (mesh, not centralized). What do we get instead? More centralization of control than ever, and less privacy and ownership than ever.
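For what "start with end-to-end encryption" means concretely, a minimal WebCrypto sketch (key management, the genuinely hard part, is omitted): the sync layer only ever sees ciphertext.

    // Encrypt a sync payload with AES-GCM before it leaves the device.
    async function encryptForSync(key: CryptoKey, plaintext: Uint8Array) {
      const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per message
      const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plaintext);
      return { iv, ciphertext: new Uint8Array(ciphertext) };
    }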
Generally that's not how I remember it - third party software on the Mac at least got some kind of a beach-head because Windows software was full of bugs, crashes, corrupted files, drivers that never worked, and patch CDs mailed to enterprise customers like they were firmware apologies. Own your own software, taken to its logical endpoint, was a shareware nightmare.
The old model of boxed updates is still in use by some companies today, JetBrains comes to mind. In either case you tuck major new features in a new major version or rolling yearly releases and sell the customer a license to the software that gets a year of updates. In a similar vein many apps I use on my Mac have monthly subscriptions but cancelling them limits their use to essentially one device, but doesn't remove the application or my access to the data.
> treat the software like a public good, absorbing the long tail of maintenance with no recurring income.
Good point. Governments would do this if they really worked "for the people"
Most of that stuff has been very much over-engineered over the last two decades.
The backend for my personal notes, tasks, bookmarks, calendar, and feeds is a set of files in directories synced with Syncthing across devices.
I ended up there after going from one app to another and getting tired of it all.
It is self hosted with no server backend (beyond a Syncthing on a NAS or VPS, optional). It is very reliable and works without Internet connection.
I could have put everything in sqlite too and sync it one way or another, but it seemed already too complicated for my requirements.
I can't share it beyond my close relatives but I had the same problem with people using Google or Microsoft before.
We have been building a local-first browser app (PWA) for personal finance, based on double-entry accounting. https://finbodhi.com/
We do use online services like Firebase for auth, and a service to fetch commodity prices etc., but the rest of the data is stored in browser storage (SQLite) and backed up to local disk (and soon Dropbox). We also sync data across devices, always encrypting data in transit.
I think it's the way to go, for most personal data applications.
The "Try demo" doesn't work on my iPhone; it keeps spinning forever. Also, please consider removing the signup friction: if it's really local-first, you don't need accounts in the cloud.
It might be that you’re trying it in Safari’s private mode. It works in regular mode, but private mode blocks storage, so the app can’t function there. We should improve the error message to make that clearer.
That said, the app is currently designed for desktop use—mobile UX is still on our roadmap.
As for signup: it helps us track and bill users. Building our own auth system for local-first would’ve been a full project on its own. Until better options exist for authorization and billing in local-first apps, we’ll stick with a cloud signup flow.
Remember when the justification for cloud was "Your work is not trapped on one device". Well, turns out your cloud data is trapped on one device, AND it's not under your control.
I've made a local first, end-to-end encrypted, auto sync bookmark extension that doesn't milk your data in any way. It's 100% private, I even don't use Google analytics on my website. Some of the reasons why I've put some work into this is:
After about 4 years of using it daily on every PC I own, I found it's a pain for me and my family when it is not installed in a browser. I thought: if it's useful for us, it might be useful for others too! So, I decided to make it available by subscription for a small fee to cover the server and other costs. I'm not really into marketing, so almost no one knows it exists. You can find it at markbook.io.
I feel like local-first or offline-first can be seen as something catered to niche users when it's brought up in front of strategy or planning folks — they imagine most people having good, reliable internet all the time. The truth is always more frustrating to account for. It is extremely frustrating to see the falloff in functionality in apps when internet is spotty or slow, which happens a lot. Try doing anything in most apps on the subway in New York (where there isn't service in most tunnels) and you'll feel the pain. Or, try doing anything in a crowd and the cell towers are saturated. Fastmail's client won't show you emails without internet, Balatro hangs while it looks for a cloud save, the list goes on and on.
Regarding the no-spinners: I think it is the wrong approach to argue that just because you have data locally, you don't need any spinners.
Whether you need a spinner or not should be decided by the User Experience (e.g., when the user has to wait for more than 100ms, show a spinner), and not by the location of the data. I am a big fan of local-first apps and enjoy building them myself. However, sometimes your app takes a moment to load. With local-first, you eliminate the network as a source of delays, but there are other factors as well, such as large data sets or complex algorithms.
For example, when you have a project planning software and want to plan 100 work packages with multiple resource combinations in an optimal way, depending on the algorithm, this can take some time. In that case, a spinner or a progress bar is a good thing.
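One pattern I like, sketched below: start the work immediately and only surface the spinner when it outlasts a perception budget (the ~100ms figure above), so fast local operations never flash one.

    // Show a spinner only if `work` takes longer than ~100ms.
    async function withSpinner<T>(work: Promise<T>, show: () => void, hide: () => void): Promise<T> {
      const timer = setTimeout(show, 100); // spinner appears only if we blow the budget
      try {
        return await work;
      } finally {
        clearTimeout(timer);
        hide(); // harmless if the spinner never appeared
      }
    }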
Agreed. No loading spinners is a good goal, but processing spinners might be unavoidable.
I didn’t get the impression that the author is advocating for removing spinners as a UI concept, rather it’s just being used a shorthand for, “you should not need to send and load the data to and from elsewhere while you are working.”
Agreed, my comment was meant to provoke exactly that conclusion ;-)
A properly designed app would leverage multithreading to place any long-running jobs in the background, allowing the user to carry on with other tasks.
Spinners should not exist in a local first app.
You are aware that 'local-first' does not mean 'no-network'. Having a sync mechanism that runs in the background without user notification can be quite disconcerting.
I mean, I did it, I built an app with a transparent background sync. Then I added a special page, 'sync center'.
In reality, mobile devices don't always have perfect network connections. Therefore, when the user is unsure whether the device is in sync or if the sync is in progress but encounters an obstacle, they might perceive the app as unreliable.
Banning spinners is dogmatic, not user-centric.
Skimming the article, it seems to touch on a lot of the right points, but the motivating first paragraph seems weak:
> Cloud apps like Google Docs and Trello are popular because they enable real-time collaboration with colleagues, and they make it easy for us to access our work from all of our devices. However, by centralizing data storage on servers, cloud apps also take away ownership and agency from users. If a service shuts down, the software stops functioning, and data created with that software is lost.
"Apple pie might be tasty and nutritious and exactly what you want, but, theoretically, apple pie could burst into flames someday, and take your favorite pie-eating bib with it.
100%! Not only local-first. But also private, zero/minimal dependency, open source and environment agnostic!
If there is anyone interested in working on such projects - let's talk! We can't leave our future to greedy surveillance zealots.
> Local-first apps, on the other hand, have better privacy and security built in at the core.
I love this article, but the section on security raised a lot of questions. What's the model for authorizing access to documents for collaboration? How do you manage keys safely for encrypted data? How do users recover "lost" keys?
Cloud computing models have a lot of security mechanisms built-in. You might not like the model (AWS IAM for example) but at least there's a foundation already in place.
Shamir's Secret Sharing lets you split a secret key into n shares held across your peer group, where any m of them (m less than n) can reconstruct it. So you can hand out shares to 5 of your friends and reproduce the secret at a later date by getting 3 of them to respond. None of the peers can reproduce the secret by themselves.
There are other options for key storage, revoking group privileges, etc. It's an extensive topic, but the foundation is there, it just depends on your network and use cases.
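For the curious, a toy sketch of the m-of-n idea over a prime field (illustration only; use a vetted library and real randomness for actual keys):

    // Shamir's secret sharing, m=3 of n=5, over integers mod a prime P.
    const P = 2n ** 127n - 1n; // a Mersenne prime, large enough for a small secret

    const mod = (a: bigint) => ((a % P) + P) % P;

    // Modular inverse via Fermat's little theorem: a^(P-2) mod P.
    const inv = (a: bigint) => {
      let r = 1n, b = mod(a), e = P - 2n;
      while (e > 0n) { if (e & 1n) r = mod(r * b); b = mod(b * b); e >>= 1n; }
      return r;
    };

    // Split: random degree-(m-1) polynomial with f(0) = secret; share i is (i, f(i)).
    function split(secret: bigint, m: number, n: number): [bigint, bigint][] {
      const coeffs = [secret];
      for (let i = 1; i < m; i++)
        coeffs.push(BigInt(Math.floor(Math.random() * 1e15))); // toy randomness only!
      return Array.from({ length: n }, (_, j): [bigint, bigint] => {
        const x = BigInt(j + 1);
        let y = 0n, xp = 1n;
        for (const c of coeffs) { y = mod(y + c * xp); xp = mod(xp * x); }
        return [x, y];
      });
    }

    // Combine: Lagrange interpolation at x = 0 from any m shares.
    function combine(shares: [bigint, bigint][]): bigint {
      let secret = 0n;
      for (const [xi, yi] of shares) {
        let num = 1n, den = 1n;
        for (const [xj] of shares)
          if (xj !== xi) { num = mod(num * -xj); den = mod(den * (xi - xj)); }
        secret = mod(secret + yi * num * inv(den));
      }
      return secret;
    }

    const shares = split(123456789n, 3, 5);
    console.log(combine(shares.slice(0, 3))); // 123456789n

With real randomness, any 2 of the 5 shares reveal nothing about the secret; any 3 reconstruct it exactly.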
Goal #2, "your data is not trapped in a single device", is the hard bit, especially combined with goal #3, "the network is optional". For #2 to be true, the network is *not* optional for the developer; it is required. So you take on the entire complexity of building a distributed app, especially one without a centralized server, which is particularly difficult even with modern local-first database tools. That makes this type of software far more complex to write than either traditional desktop apps or cloud apps.
We tried to adopt this last month at work, and it failed. E.g. the mentioned Automerge has poor docs (https://automerge.org/docs/reference/library_initialization/...), which left a lot of questions open; it seems backend-agnostic, but you have to figure out how to store and how to broadcast yourself.
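(For concreteness, the "how to store" half can bottom out in something as small as this sketch using @automerge/automerge's save/load; the file path is made up. The "how to broadcast" half is the part the docs really leave open.)

    import * as Automerge from "@automerge/automerge";
    import { readFileSync, writeFileSync } from "node:fs";

    let doc = Automerge.from({ items: [] as string[] });

    // Store: the whole document serializes to an opaque byte blob.
    writeFileSync("doc.bin", Automerge.save(doc));

    // Load: rehydrate it later, on any machine.
    const restored = Automerge.load<{ items: string[] }>(readFileSync("doc.bin"));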
Yeah, I tried to build a project on Automerge, but I ended up switching to Yjs; it seems more mature.
100% agree! I built Paisley (because it is the opposite of Plaid) to host your personal finances locally, and it is 100% open source. Paisley pulls data from your financial institutions by scraping balances and importing CSV exports, storing everything locally in a simple SQLite database.
https://github.com/patrickcollins12/paisley
Didn't this already happen? The internet died 20 years ago. Now it is just ‘somewhat’ interconnected intranets with their own local legislation?
Awesome to see this getting more coverage. I am very interested in local-first and I am working on several progressive web apps based around it. One app depends on file sync, not database sync, and the best I have found is remoteStorage.js. It's not perfect, but it's very much the missing piece I was often looking for.
I love this idea of local-first software, but from a business point of view there's unfortunately little incentive to adopt it, since it's nowhere near as profitable as SaaS. That, in my opinion, is the biggest bottleneck to worldwide adoption right now.
Nextcloud with a few addons - all open source - gets you feature parity with all of that lot.
NC itself gets you file sync and WebDAV etc. An add-on gets you the webby version of LibreOffice. You can bolt on AI add-ons to classify and tag your images/photos and, with a bit more effort, your docs too.
It's properly local first.
With CRDT implementations like Yjs, writing your own synchronization engine is trivial: https://greenvitriol.com/posts/sync-engine-for-everyone
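The claim holds at small scale: a two-replica wiring (similar to the example in the Yjs docs) really is the whole engine. It's in-process here, but the update blobs could travel over any transport.

    import * as Y from "yjs";

    const a = new Y.Doc();
    const b = new Y.Doc();

    // "Sync" is just forwarding opaque update blobs in both directions;
    // redundant updates don't re-fire, so this doesn't loop.
    a.on("update", (u: Uint8Array) => Y.applyUpdate(b, u));
    b.on("update", (u: Uint8Array) => Y.applyUpdate(a, u));

    a.getMap("kv").set("greeting", "hello");
    console.log(b.getMap("kv").get("greeting")); // "hello"

The hard parts land elsewhere: persistence, auth, and presence.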
In a world of owning nothing and paying subscriptions for everything, owning your data and using software that is either yours or libre is 'rebellion' to many a service provider.
It's not that "local-first" is some sort of cloud-diet trend; it should just be the norm.
Right. I don't even understand why this article had to be this verbose. It's not like we need to be "convinced" that local is better. Everybody who values privacy and independence knows already. But this stuff is unimplementable -- we suffer from the cloud disease because it's immensely profitable for the cloud providers and cloud-based app providers to enslave us, and to bleed us out. Their whole point is locking us in.
"Sharing models" are totally irrelevant until the concept of self-determination is tolerated by the powerful (and they will never tolerate it). Accessing my data from multiple devices is totally secondary; I don't trust mobile endpoints to access my remote data in the first place.
How about redundancy in general. Not local first, not cloud first, but "anything can be first and last". That's how the "cloud" works in the first place. Redundancy. Mesh networks as well.
People's personal computers, even their tablets and phones are so powerful, they can fulfill most use cases (except AI), especially if the application is reasonably efficient.
Synchronize execution. Not data. https://multisynq.io Synchronization of the data is implicit. NO centralized anything.
AIs like GPT being non-local is one of my biggest issues with it.
Offline-first, now with CRDTs, and a brand new name!
One thing I’m personally excited about is the democratization of software via LLMs.
Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points the unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.
Getting a user to install a local DB and a service to run their app (god forbid, updating said service), is a challenge that’s complex, even for developers (hence the prevalence of containers).
It will take some time (i.e. pre-training runs), but this is a future I believe is worth fighting for.
> Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points the unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.
Not sure where your experience is coming from but when I asked an LLM, Claude to be more precise, it referred me to local options first, such as SQLite. It didn't consider cloud platforms at all until I had asked, presumably because it can understand local code and data (it can query it directly and get back results) but cannot understand the context of what's in the cloud unless you configure it properly and give it the env variables to query said data.
What was your prompt?
In my experience it’s great at utilizing local storage and SQLite, if you ask it to.
I just asked the ChatGPT web client (4o, as that’s what most non-developers might default to):
> Can you build me a website for my photos
And it immediately started suggesting Wordpress, Wix, Squarespace, etc.
Specifically, this was section 4 of the various questions it asked me:
> 4. Tech Preference (optional)
> - Do you want this as a static HTML site, WordPress, or built with something like React, Next.js, or Wix/Squarespace?
> - Do you need help hosting it (e.g., using Netlify, Vercel, or shared hosting)?
As a non-programmer, I likely wouldn’t understand half those words, and the section is marked optional.
If I follow the “default path” I’m quickly forking over a credit card and uploading my pictures of dogs/family/rocks to the cloud.
Local LLMs are even more amazing in concept, all of the world's knowledge and someone to guide you through learning it without needing anything but electricity (and a hilariously expensive inference rig) to run it.
I would be surprised if in a decade we won't have local models that are an order of magnitude better than current cloud offerings while being smaller and faster, and affordable ASICs to run them. That'll be the first real challenger to the internet's current position as "the" place for everything. The more the web gets enshittified and commercialized and ad-ridden, the more people will flock to this sort of option.
The speed alone is sufficient for a local-first approach. The latency of any cloud software I’ve ever used is like constant sand in the gears of thinking. Although taking supplements that slow my thinking—essentially natural downers—does improve my experience with such software, the improved experience comes at the expense of IQ. Basically, you need to be a little slow and dumb for the software to work as intended.
This is nuts. Computers are supposed to enhance and enable thinking, not make you stupid. In this sense, cloud software is possibly the biggest fraud ever perpetrated on the paying, computer-using public.
For the love of God, please bring back my late 1990s and early 2000s brain-boosting computer experience.
That was published 6 years ago. What's the state of the art of local-first software technology in 2025?
One compromise could be to host the software but also offer the option for self hosting.
Local-first apps should not need hosting.
I mean, if local-first isn’t possible. I’m more comfortable with web-based development; plus, don’t you need to apply for all kinds of certificates to be “allowed” to run on Windows and Mac these days?
Databases like Postgres can be run locally or as part of some kind of managed service in the cloud. Anyone know of recent stats that show the percentage of databases that are managed locally vs by some cloud service?
That’s why I’m working on https://collate.one - offline AI workspace
yes but think of all those poor shareholders with unmaximized value you heartless man!
Local-first almost equates to being both privacy-protective and a public software good.
Essentially antithetical to capitalism, especially America's toxic late-stage subscription-based enshittification.
Which means it's typically a labor of love, or a government org with a long-term understanding of Software as Infrastructure (as opposed to SaaS).
"Local first" is neither equivalent to privacy protection or public software good. Many businesses sell local-first software that still contains remote backdoors[0] you cannot control. And it most certainly doesn't ensure "public software good" when there is zero obligation to improve the upstream or empower users to seek alternatives.
I would sooner trust a GPL-licensed remote software program than store a kilobyte of personally identifying information in a proprietary "local first" system.
[0] https://www.macrumors.com/2023/12/06/apple-governments-surve...
I think you mean antithetical to corrupted conflict-of-interest capitalism.
Conflict-of-interest transactions have hidden or coercive impact, lined up in favor of the party with stronger leverage. Examples include un-asked and unwanted surveillance of data or activity, coercive use of information, vendor lock in, unwanted artificial product/service dependencies, insertion of unwanted interaction (ads), ...
None of that is inherent to capitalism. They clearly violate the spirit of capitalism, free trade, etc.
It is providers taking advantage of customers' lack of leverage and knowledge to extract value that does not reflect the plain transaction actually desired by customers. Done legally but often with surreptitious disclosure or dark-pattern permissions; borderline legally, where customers would incur great costs to identify and protest; or plain old illegally but in a hidden manner, with a massive legal budget to provide a moat against accountability.
It is tragic that the current generation of Silicon Valley and VC firms have embraced conflict of interest based business models. Due to the amounts of money that scaling "small" conflicts can make. Despite the great damage that we now know scaling up "small" conflicts can do.
That was not always the case.
The problem with our current system of capitalism is that it causes capital to accumulate. This leads to less competition, fewer checks and balances, and undermines the whole "wisdom of the crowd" mechanism that capitalism is premised on.
If we want a functioning market based system then we need to explicitly correct for this by aggressively taxing the wealthiest entities (individuals and companies) in our society to bring things closer to a level playing field.
"corrupted conflict-of-interest capitalism" is just capitalism.
Free trade is antithetical to capitalism. Free trade means everyone is on a level playing field, but capitalism means those with more capital are above the rest. These are obviously not compatible.
It might be antithetical to rent seeking at best, but capitalism?