Comment by sublinear
2 days ago
May I humbly suggest that those files probably belong in an LFS submodule called "assets" or "vendor"?
Then you can clone without checking out all the unnecessary large files and still get a working build. This also helps on the legal side, since each repo can be licensed correctly.
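Roughly what I have in mind, with placeholder URLs, paths, and file patterns:

    # in the assets repo, track the heavy files with LFS
    git lfs install
    git lfs track "*.png" "*.fbx"
    git add .gitattributes
    git commit -m "Track binaries with LFS"

    # in the main project, pull it in as a submodule named "assets"
    git submodule add https://example.com/your-org/assets.git assets

    # consumers clone without downloading the LFS blobs...
    GIT_LFS_SKIP_SMUDGE=1 git clone --recurse-submodules https://example.com/your-org/project.git
    # ...and fetch only the paths the build actually needs
    cd project/assets
    git lfs pull --include="textures/"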
I'm struggling to see how this is a problem with git and not just antipatterns that arise from badly organized projects.
The problem I've run into with this is that those files stay in the history. Your git clones will get ridiculous, and you'll blast through any git repo size limits that you might have.
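To put numbers on it, something like this (standard git plumbing, nothing repo-specific) shows how much of a clone is dead weight from old binaries:

    # size of the whole object database, including every historical blob
    git count-objects -vH
    # largest objects anywhere in history, with the paths they were committed under
    git rev-list --objects --all \
      | git cat-file --batch-check='%(objectsize) %(objectname) %(rest)' \
      | sort -nr | head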
I just want my files to match what's expected when I pull a commit, without needing some literal "commit build system" and "pull build system". Coming from Perforce and SVN, I can't comprehend why git is so popular, beyond cargo cult. It's completely nonsensical to think that software is just source.
If the .git subdir of your LFS submodule is really so large that a --no-checkout clone is unreasonable then you can either prune it down to a more recent date range or squash down commits before pushing and keep your more detailed history local.
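Something along these lines, with placeholder dates and branch names:

    # fetch only history newer than a cutoff date
    git clone --shallow-since=2024-01-01 https://example.com/your-org/assets.git

    # or keep the detailed history on your local branch and push one squashed commit
    git checkout main
    git merge --squash feature/new-textures
    git commit -m "Update textures"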
The point of a clone without checkout is so you can pick the files you really need. This is not a trivial detail that should be automated away. If the user doesn't know which files matter that's an organizational and project management problem, not a git problem. Developers should absolutely care about such details.
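That's the point of the two-step clone: nothing hits your working tree until you ask for it. A sketch, with the directory names obviously being placeholders:

    # clone without populating the working tree
    git clone --no-checkout https://example.com/your-org/project.git
    cd project
    # declare the directories you actually need, then check out
    git sparse-checkout set src docs
    git checkout main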
The usual workflow is that you develop on a feature branch and only push when you have final changes ready to merge, described by a single commit or just a few, so that the history isn't a mess. Someone should be reviewing your changes, after all. Git is for versioning, not nightly backups. It's also generally preferred that you commit the project files for your tools, not the binaries from the build output.
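Tidying the branch before pushing is one interactive rebase (branch names are placeholders):

    # on your feature branch, collapse the work-in-progress commits
    git rebase -i origin/main        # mark the noisy commits as "squash" or "fixup"
    git push --force-with-lease origin feature/new-parser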
The source is always more important because the goal is to understand and review the changes to the project, not just dump everything in there and make managing the project a nightmare. Your use case sounds unusual and is likely caused by many binary file changes being committed frequently. You might be correct that git isn't the right tool for that, but it never claimed to be. This doesn't mean git is a bad tool, but that your workflow doesn't seem to care about reviewing every detail before merge.
The user shouldn't have to think about such a thing. Version control should handle everything automatically and not force the user into doing extra work to work around issues.
I always hated the “write your code like the next maintainer is a psychopath” mantra because it makes the goal unclear. I prefer the following:
Write your code/tools as if they will be used at 2:00 am while the server room is on fire. Because sooner or later they will be.
A lot of our processes are used like emergency procedures. Emergency procedures are meant to be as brainless as possible, so you can reserve the rest of your capacity for the actual problem. My version essentially calls out Kernighan's Law.
Organizing your files sensibly is not necessary to use LFS, nor is it a "workaround". It's just a pattern I'm suggesting to make life easier regardless of what tools you decide to use. I can't think of a case where organizing your project to fail gracefully is a bad idea.
Git does the responsible thing and lets the user determine how to proceed with the mess they've made.
I must say I'm increasingly suspicious of the hate that git receives these days.