Comment by DataDaoDe
9 days ago
Yes, a thousand percent! I'm working on this too. I'm sick of everyone trying to come up with a use case to get all my data into everyone's cloud so that I have to pay a subscription fee just to make things work. I'm working on a fitness tracking app right now that will use the Sublime Text model: just buy it, get updates for X years, sync with all your devices, and use it forever. If you want updates after X years, buy the newest version again. If it's good enough as is, and that's the goal, just keep using it forever.
This is the model I want from 90% of the software out there: just give me a reasonable price to buy it, make the product good, and don't marry it to the cloud so much that it's unusable without it.
There are also a lot of added benefits to this model beyond data privacy (most are mentioned in the article), but not all the problems are solved here. This is a big space that still needs a lot of tooling to make things really smooth, but the tech to do it is there.
Finally, the best part (IMHO) about local-first software is that it brings back a much healthier incentive structure: you're not monetizing via ads, tracking users, or maxing "engagement"; you're just building a product and getting paid for how good it is. To me it feels like software that actually serves the user.
Obsidian, the note-taking app, is a great model to follow as well. The client is completely free, and they sell an optional syncing service. The notes are all plain Markdown files, so the client is completely optional.
This is the reason I have always refused to use the Bear note-taking app, irrespective of how good and snappy that app is: they now keep their notes in a SQLite DB, and even though that file can be backed up and handled locally, my notes are not easily accessible to me. I can't easily edit my notes in other editors (which I often like to do on my Mac), and I can't back up and sync those files under version control the way I want outside of iCloud (which is what Bear uses).
What is sad is that they used to be a local-files-first note app, and then they moved to SQLite citing some sync and performance issues.
> What is sad is that they used to be local files first note app and then they moved to sqlite citing some sync and performance issues.
They're still a local-first note application. It's just slightly harder for you to edit your notes externally, and not even by that much: it's very easy to directly query (read and write) SQLite databases, and if you really cared, you could write a script that grabs a note, exports it to a temporary text file, lets you edit it, and then updates the SQLite database.
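That script is only a few lines. Here's a rough Python sketch of the round trip; the `notes(id, body)` schema is hypothetical (Bear's real table and column names differ, so inspect the actual file with `.schema` first):

```python
import os
import sqlite3
import tempfile

# The notes(id, body) schema below is a stand-in; check the real
# database with the sqlite3 CLI's `.schema` before adapting this.

def export_note(db_path, note_id):
    """Copy one note's body out of the SQLite database into a temp text file."""
    with sqlite3.connect(db_path) as conn:
        (body,) = conn.execute(
            "SELECT body FROM notes WHERE id = ?", (note_id,)
        ).fetchone()
    fd, path = tempfile.mkstemp(suffix=".md")
    with os.fdopen(fd, "w") as f:
        f.write(body)
    return path

def import_note(db_path, note_id, path):
    """Write the (externally edited) file back into the same row."""
    with open(path) as f:
        body = f.read()
    with sqlite3.connect(db_path) as conn:
        conn.execute("UPDATE notes SET body = ? WHERE id = ?", (body, note_id))

# Round-trip demo against a throwaway database:
db = os.path.join(tempfile.mkdtemp(), "notes.db")
with sqlite3.connect(db) as conn:
    conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")
    conn.execute("INSERT INTO notes VALUES (1, 'original text')")

tmp = export_note(db, 1)
with open(tmp, "a") as f:  # stands in for opening $EDITOR on the temp file
    f.write("\nedited externally")
import_note(db, 1, tmp)
```

One caveat: writing to the database while the app itself is open risks clobbering its own pending changes, so this is safest with the app closed.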
> I can't version controlled backup and sync those files
You absolutely can: you can dump a SQLite database to a text file of SQL statements that will recreate it, and then back up and sync that file: https://stackoverflow.com/questions/75675/how-to-dump-the-da...
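For example, Python's built-in `sqlite3` module can produce the same SQL-text dump as the `sqlite3` CLI's `.dump` command, shown here on a throwaway in-memory database:

```python
import sqlite3

# Throwaway in-memory database standing in for the app's notes store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO notes VALUES (1, 'hello')")
conn.commit()

# iterdump() yields the same SQL text as the CLI's `.dump` command.
dump = "\n".join(conn.iterdump())

# The dump is plain text, so it diffs and version-controls cleanly, and
# restoring is just executing the script against an empty database:
restored = sqlite3.connect(":memory:")
restored.executescript(dump)
```

Commit `dump` to git and you get line-level diffs of your notes history for free.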
> then they moved to sqlite citing some sync and performance issues
Yes, that's because "plain text" files are bad for performance and harder to sync correctly. For people who (1) have over a hundred thousand notes they want to keep (like me) and (2) want maximum confidence that they're not going to lose years' worth of work, that's incredibly important.
The devs made the right choice. You can always write scripts to interface with a SQLite database with an external editor. You can't take plain text files and magically make them as fast and durable as a database.
SQLite is still local-first. Couldn't they just also provide your notes via an Obsidian-like file-folder structure, while using SQLite for in-app performance?
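Technically, yes; a one-way mirror is a few lines of code. This sketch assumes a hypothetical `notes(title, body)` schema, and the app would re-run it after each save:

```python
import os
import re
import sqlite3
import tempfile

def mirror_to_markdown(db_path, out_dir):
    """One-way export of every note in the SQLite store to a .md file.
    The notes(title, body) schema is hypothetical; a real app would
    re-run this after each save to keep the folder current."""
    os.makedirs(out_dir, exist_ok=True)
    with sqlite3.connect(db_path) as conn:
        for title, body in conn.execute("SELECT title, body FROM notes"):
            safe = re.sub(r"[^\w\- ]", "_", title)  # crude filename sanitizing
            with open(os.path.join(out_dir, safe + ".md"), "w") as f:
                f.write(body)

# Demo with a throwaway database:
db = os.path.join(tempfile.mkdtemp(), "notes.db")
with sqlite3.connect(db) as conn:
    conn.execute("CREATE TABLE notes (title TEXT, body TEXT)")
    conn.execute("INSERT INTO notes VALUES (?, ?)",
                 ("Shopping list", "- eggs\n- milk"))

out = tempfile.mkdtemp()
mirror_to_markdown(db, out)
```

The hard part isn't the export, it's making the mirror two-way without conflicts, which is presumably why most apps don't bother.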
I didn't know they had made this change, which means it's time to think about migrating away from Bear. Which is a pity, because the software itself is rock solid.
The syncing is just godawfully slow, so much so that after two years of use I have almost stopped taking notes.
How do you plan to do the syncing without some sort of cloud infrastructure?
The benefit of local-first is that you're not incentivized to sell your own cloud offering, so you can just give options: sync with iCloud, Google Drive, OneDrive, Dropbox, Mega, SMB, SFTP, FTP, whatever you feel like adding support for. And since local-first usually means having some kind of sane file format, you can let "advanced" users manage their own files and synchronization like people have been doing for the last 50 years.
There are a lot of valid answers to this! One is to use your platform's provided service, like OneDrive or iCloud. Another is to integrate with some other sync platform; Dropbox is a popular target for this. Peer-to-peer is another option, although that obviously comes with its own limitations. Finally, bring-your-own-sync is a popular choice amongst open-source apps, where you provide a self-hostable sync server.
Note that this thread is full of people claiming that using SQLite with iCloud sync is evidence of some conspiracy theory or other!
Check out Aardvark (renamed to Reflection), a collaborative note-taking app from the GNOME folks. I think the idea isn't to completely remove cloud infrastructure, but to at least make it optional and/or provide alternatives. For example, this note app works via P2P. blogs.gnome.org/tbernard/2025/06/30/aardvark-summer-2025-update/
For Joplin I use WebDAV with the 10 GB of free file storage that comes with Fastmail. So I have easy sync across multiple platforms and form factors, and even substantial notes make little dent in the allowance.
Something like Syncthing, perhaps?
Anyone know of any mobile apps that have done this and bundled their own fork of Syncthing under the hood for syncing?
Ideally, you would use existing commodity infrastructure but we have found none of it is really super fit for our purposes. Failing that, we have been developing an approach to low-maintenance reusable infrastructure. For now, I would advise running your own but positioning yourself to take advantage of commodity systems as they emerge.
Syncthing
There's a git plugin.
You can use FTP and SVN.
Both of those require a server.
Right now it's in WebRTC.
> get updates for X years, sync with all your devices and use it forever. If you want updates after X years buy the newest version again. If its good enough as is - and that's the goal - just keep using it forever.
While this sounds like a good deal, with this approach:
- You have to charge the total cost of the subscription up front (1 or 2 years).
- You still have to keep servers running for syncing, and you have to think about cases where a user syncs a year of data in a single day.
- You have to keep people on the payroll for future development.
(You are thinking only from the developer's perspective here.)
You don't have to keep servers running if there aren't any servers (P2P), or if you offload syncing onto some other cloud.
Had similar thoughts a few years back (https://rodyne.com/?p=1439) when considering worst-case scenarios, after a local factory lost two days' production due to a server failure at an IT supplier.
A backend can be part of the functionality, though, such as for real-time collaboration and syncing. But you can have ownership and longevity guarantees for both the data and the service, as long as you can eject [1] from the cloud and switch to self-hosting (or back) at any time, which is what we do for our notes/tasks IDE.
[1] https://thymer.com/local-first-ejectable
Totally agree. If you don't mind, what tech stack are you using for your fitness tracking app? I'm particularly curious about how you handle cross-device sync :)
your comment is insane imo. some people talk that way in real life. it's not their fault LLMs were invented.
haha, 100% real. not a native speaker though, so sometimes i catch myself mimicking that LLM tone from all the exposure to them. appreciate the feedback, will definitely work on developing a more authentic writing style from now on.
but yeah, you're right to be concerned, i'm seeing bots everywhere too.
Continuing the conversation by asking a question is now an LLM tell, on a four-sentence comment? I'm sorry, but that's inane.
They'd have used "—", not "-", if they were an AI.
What if you are an old man and more clouds than ever are appearing which deserve a good fist shaking?
Asking for a friend . . .
>you're not monetizing via ads
Yes, you are. You can find tons of purely local apps that monetize themselves with ads.
Sure, you could. I'm not; I don't think it's in the spirit of local-first. And I wouldn't pay money for that, but if you or someone else wants to build that kind of software, it's a free world :)
It’s easy to say you wouldn’t do that, but if it gets to the point where you have an employee helping you out and in a downturn you have to choose between laying them off or pushing an ad to keep paying them one more quarter, you might reconsider.
> You can find tons of purely local apps tha[t] monetize themselves with a[d]s.
How do they do that without hitting the internet?
Point 3 from the article is
>3. The network is optional
Ad SDKs usually allow caching ads for a period of time so that ads can still be shown while the device is temporarily offline.
It's "local first", not "local only".
I could be wrong, but I think they're referring to the WinRAR model, where there are occasional "annoyances" that you can either ignore or pay to get rid of.
Bro, who wants your pointless fitness data? Not even you care that much about it. Just use a notepad, ffs.
Fitness data tells a lot: your health status, your daily schedule, and, with running/cycling/..., your exact whereabouts. That is quite valuable information.
A notepad also isn't enough to correlate heart rate etc. with specific exercises and plot it over time.
Tell me more. Lol. I just did a 10k run and tracked it with my watch, but tell me how any of that matters to anyone except me (and it doesn't even matter to me what my HR was over that run, though I did use a HRM, mainly to keep myself from over-exertion). I really don't understand what fitness apps are supposed to do; they're possibly the most useless thing ever invented. I wrote my own app in Clojure over a decade ago and used it to track my workouts for a year or two. I never go back and look at a workout more than a week old, maybe two weeks at the most. It simply isn't good data; it is the least valuable data one can generate.
> I'm sick of everyone trying to come up with a use case to get all my data in everyone's cloud so I have to pay a subscription fee to just make things work.
AI photo and video generation is impractical to run locally.
ComfyUI and Flux exist, but they serve a tiny sliver of the market with very expensive gamer GPUs. And if you wanted to cater to that market, you'd have to support dozens of different SKUs and deal with Python dependency hell. Even then, proficient ComfyUI users spend hours experimenting and waiting for renders; it's really only a tool for niche artists with extreme patience, such as the ones who build shows for the Las Vegas Sphere, not your average graphic designers and filmmakers.
I've been wanting local apps and local compute for a long time, but AI at the edge is just so immature and underpowered that we might see the next category of apps only being available via the cloud. And I suspect that these apps will start taking over and dominating much of software, especially if they save time.
Previously I'd only want to edit photos and videos locally, but the cloud offerings are just too powerful. Local cannot seriously compete.
But who said anything about AI? Lots of local-first apps neither have nor need any AI whatsoever. And by the way, Topaz Labs has good offerings for editing photos and videos with AI that run locally and work great for many use cases (although they're not fully generative like Veo etc.; it's more upscaling and denoising, which does use generative AI, but not in the same way).
Most cloud apps have no need for AI either, but companies are pushing it anyway for bullshit marketing reasons, similar to what they did with blockchain a decade ago.
I suspect that most content will be generated in the future and that generation will dominate the creative fields, white collar work, and most internet usage.
If that's true, it's a substantial upset to the old paradigms of data and computing.
> AI photo and video generation is impractical to run locally.
You think it always will be? What can the new iPhone chips do locally?
Regardless of what hardware capabilities exist, the previous post makes it sound like every application needs AI, which is just not true.
> You think it always will be? What can the new iPhone chips do locally?
I suspect we're a decade off from being able to generate Veo 3, Seedance, or Kling 2.1 videos directly on our phones.
This is going to require both new compute paradigms and massively more capable hardware. And by that time who knows what we'll be doing in the data center.
Perhaps the demands of generating real time fully explorable worlds will push more investment into local compute for consumers. Robotics will demand tremendous low latency edge compute, and NVidia has already highlighted it as a major growth and investment opportunity.