Comment by reactordev
2 days ago
yeah, this isn't really solving the problem. It's just punting it. While I welcome a short-circuit filter, I see dragons ahead. Dependencies. Assets. Models... won't benefit at all as these repos need the large files - hence why there are large files.
There seems to be a misunderstanding. The --filter option simply doesn't populate the .git directory with content that isn't required for the checkout. If a large file is needed for the current checkout (i.e. the working tree, not the .git folder), it will be fetched regardless of the filter option.
To put it another way, regardless of what max size you give to --filter, you will end up with a complete git checkout, no missing files.
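A quick way to see this for yourself (a local sketch; repo paths and the 2 MB file are made up for illustration, and it assumes Git 2.19+ with `uploadpack.allowFilter` enabled on the source side):

```shell
set -eu
tmp=$(mktemp -d)
cd "$tmp"

# Build a source repo containing one large (~2 MB) and one small file.
git init -q src
cd src
git config user.email t@example.com
git config user.name t
head -c 2097152 /dev/zero > big.bin
echo hello > small.txt
git add big.bin small.txt
git commit -qm "add files"
# The serving side must opt in to partial-clone filters.
git config uploadpack.allowFilter true
cd ..

# Partial clone: blobs over 1 MB are excluded from the initial transfer,
# but Git lazily fetches big.bin anyway because the checkout needs it.
# (file:// forces a real transport; a plain local path ignores --filter.)
git clone -q --filter=blob:limit=1m "file://$tmp/src" clone
```

After this, `clone/big.bin` is present and complete in the working tree; the filter only changed what was packed into `.git` up front, not what the checkout contains.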
It’s definitely not a full solution, but it would cover the cases where the desired behavior is keeping the full history of the large files available, just not on everyone’s machine.