
Comment by IshKebab

14 hours ago

Not really. Git does use delta-based storage for binary files. Deltas work poorly for some files (e.g. already-compressed ones, where a small edit changes most of the stored bytes), but that's relatively easy to solve.
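You can check this on your own repo, by the way; a quick sketch (pack file name is a glob, and `verify-pack -v` prints a delta depth and base object for each deltified entry):

    # Force a full repack so git recomputes deltas from scratch.
    git repack -adf

    # Deltified objects show a chain depth and a base object at the
    # end of each row of output.
    git verify-pack -v .git/objects/pack/pack-*.idx | head -20

    # Blobs above this threshold are stored whole, with no delta
    # attempt (the default is 512m).
    git config core.bigFileThreshold 512m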

The real problem is that Git wants you to have a full copy of every file that has ever existed in the repo. As soon as you add a large file it's there forever and can basically never be removed: the only real escape is rewriting history, which changes every subsequent commit hash and breaks everyone's existing clones. If you keep editing the file, every revision adds yet more permanent data to the repo.
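To be fair, you can strip it out, but only by rewriting history with something like git-filter-repo (a third-party tool, not stock git); a sketch:

    # Rewrite all of history, dropping every blob larger than 10 MB.
    git filter-repo --strip-blobs-bigger-than 10M

    # Every commit from the first affected one onward gets a new hash,
    # so all existing clones and forks break, and the hosting server
    # may keep the old objects around until it garbage-collects them.

Which is exactly the kind of jank I mean: it works, but it's a destructive, whole-team operation rather than a normal feature.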

Git is really missing:

1. A way to delete old data.

2. A way for the repo to indicate which data is probably not needed (old large binaries).

3. A way to serve large files efficiently (e.g. from a CDN).

Some of these can sort of be done today (partial clones get you part of 1 on the client side, Git LFS gets you 3), but it's super janky. You have to proactively pass confusing flags, and every contributor has to know to do it.
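Roughly, the current workarounds look like this (the URL is a placeholder; Git LFS is a separate install, not part of stock git):

    # Partial clone: skip blobs over 1 MB; they are fetched on demand
    # when a checkout actually needs them.
    git clone --filter=blob:limit=1m https://example.com/repo.git

    # Shallow clone: only the most recent commit, no history.
    git clone --depth 1 https://example.com/repo.git

    # Git LFS: store matching files outside the repo and commit only
    # small pointer files; the server can then put the real content
    # behind a CDN.
    git lfs install
    git lfs track "*.psd"

Note that none of these delete anything from the server side, so they only paper over 1 and 2.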