Comment by embedding-shape

6 hours ago

> I use dir trees for code ofc, but everything else is flat in my ~/Documents.

Which is great, but on all major OSes you'll eventually hit performance issues with a flat directory like this. It might not be a problem in month one, or even year one, but after ten years of note taking/journaling, a large flat directory will start to show its limits.

So eventually you'd need to shard it somehow, so you might as well start categorizing/sorting things from the get-go, at least into some broad major categories. Doing it once you already have 10K entries in a directory sucks big time.

If it's just performance, `cd ~/Documents && mkdir old && mv ./* old/` (or today's date instead of `old`). I actually have that layout on one PC.
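As a sketch of the date-stamped variant (using a scratch directory rather than the real `~/Documents`, and `find` instead of `mv ./*` so the new archive directory isn't moved into itself):

```shell
# Hypothetical sketch: archive a flat documents directory into a dated subdir.
docs=$(mktemp -d)                       # stand-in for ~/Documents
touch "$docs/a.txt" "$docs/b.txt" "$docs/c.txt"

archive="$docs/$(date +%Y-%m-%d)"       # e.g. .../2024-05-01
mkdir "$archive"

# Move only top-level regular files; this skips the archive dir itself,
# which a plain `mv ./* "$archive"/` would try (and fail) to move.
find "$docs" -maxdepth 1 -type f -exec mv {} "$archive/" \;
```

After this, the top level holds only the dated directory, and new notes land in a fresh, fast flat directory again.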

If real organization is needed, it seems like that'd be easier in hindsight than requiring foresight.

  • So then you have one intentionally slow directory ("old/" in this case) and one fast directory?

    Personally I'd categorize stuff, but you do you; there really isn't any wrong way to do it. If it works, it works :)

    • I meant you `mv` into `old/` before it gets too big. I've never actually seen a directory get slow like this, only with programmatic things like creating 1M JSON files.