Comment by HexDecOctBin

1 day ago

So this filter argument will reduce the repo size when cloning, but how does one reduce the repo size after a long stint of local commits that change binary assets? Delete the repo and clone again?
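For reference, the filter in question is git's partial-clone option. A minimal sketch of the clone commands being discussed; the repository URL is just a placeholder:

    # Omit blobs larger than 1 MB from the clone; git fetches them
    # on demand whenever a checkout actually needs them.
    git clone --filter=blob:limit=1m https://example.com/big-repo.git

    # Or defer every blob until it is needed for a checkout.
    git clone --filter=blob:none https://example.com/big-repo.git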

It's really not clear which behaviour you want, though. For example, when you do lots of bisects you probably want to keep everything downloaded locally; if you're just working on new things, you may want to prune the old blobs. That information only exists in your head, though.

For lots of local edits you can squash commits using the rebase command with the interactive flag.
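A rough sketch of that, with the garbage collection needed afterwards to actually reclaim the space; the commit count is just an example, and this only helps if the big blobs are referenced solely by the squashed-away commits:

    # Interactively squash the last 20 local commits.
    git rebase -i HEAD~20

    # The superseded blobs are now unreachable, but reflog entries
    # still keep them alive; expire the reflogs and garbage-collect.
    git reflog expire --expire=now --expire-unreachable=now --all
    git gc --prune=now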

Yeah, this isn't really solving the problem; it's just punting it. While I welcome a short-circuit filter, I see dragons ahead. Dependencies. Assets. Models... won't benefit at all, as these repos need the large files - hence why the large files are there.

  • There seems to be a misunderstanding. The --filter option simply doesn't populate content in the .git directory that is not required for the checkout. If a file is large but needed for the current checkout (i.e. the parts not in the .git folder), it will be fetched regardless of the filter option.

    To put it another way, regardless of what max size you give to --filter, you will end up with a complete checkout, with no files missing from the working tree (see the sketch after this list).

  • It’s definitely not a full solution, but it seems like it would solve cases where the desired behavior is to have the full history of the large files available, just not on everyone’s machine.
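To make the point in the first bullet concrete, here is one way to see what a blob-size filter actually withholds; the size limit and URL are illustrative:

    git clone --filter=blob:limit=100k https://example.com/big-repo.git
    cd big-repo

    # The working tree is complete. Only blobs over the limit that are
    # not needed for the current checkout stay missing from .git; they
    # are listed here prefixed with '?'.
    git rev-list --objects --missing=print HEAD | grep '^?'

Any of those missing blobs is fetched automatically the moment a later checkout, diff, or blame needs its contents.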