
Comment by golergka

3 days ago

It takes less than an hour on my third-world apartment wifi to download the Call of Duty Modern Warfare remake, which is over 200 gigabytes. Since we're not talking about remote work here, I think Microsoft offices and servers (probably on a local network) could have managed similar bandwidth back then.

There is a lot more to it than that. Check out "The largest Git repo on the planet" by Brian Harry, who was in charge of the Git migration and of Azure DevOps (Microsoft's counterpart to GitHub):

https://devblogs.microsoft.com/bharry/the-largest-git-repo-o...

> For context, if we tried this with “vanilla Git”, before we started our work, many of the commands would take 30 minutes up to hours and a few would never complete. The fact that most of them are less than 20 seconds is a huge step but it still sucks if you have to wait 10-15 seconds for everything. When we first rolled it out, the results were much better. That’s been one of our key learnings. If you read my post that introduced GVFS, you’ll see I talked about how we did work in Git and GVFS to change many operations from being proportional to the number of files in the repo to instead be proportional to the number of files “read”. It turns out that, over time, engineers crawl across the code base and touch more and more stuff leading to a problem we call “over hydration”. Basically, you end up with a bunch of files that were touched at some point but aren’t really used any longer and certainly never modified. This leads to a gradual degradation in performance. Individuals can “clean up” their enlistment but that’s a hassle and people don’t, so the system gets slower and slower.

Great quote from him here.
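
To make the "over hydration" point concrete, here is a minimal sketch in plain Python. This is not GVFS/VFS for Git code; the class, per-file cost, and file counts are all made up for illustration. It just shows why a command whose cost scales with the number of hydrated files stays fast at first, then degrades as an enlistment accumulates files that were read once and never touched again.

```python
# Hypothetical sketch of over-hydration in a virtualized repo.
# Files start as placeholders; the first read "hydrates" them, and
# hydrated files are scanned by later commands even if never modified again.

class VirtualizedRepo:
    def __init__(self, total_files: int):
        self.total_files = total_files
        self.hydrated: set[int] = set()   # files that have been read at least once

    def read(self, file_id: int) -> None:
        # Reading a placeholder hydrates it; it stays hydrated afterwards.
        self.hydrated.add(file_id)

    def status_cost(self) -> float:
        # Cost is proportional to hydrated files, not to files in the repo.
        per_file_cost = 1e-6              # made-up per-file scan cost, in seconds
        return len(self.hydrated) * per_file_cost


repo = VirtualizedRepo(total_files=3_000_000)

# Day 1: a developer works in one component and reads 10k files.
for f in range(10_000):
    repo.read(f)
print(f"status cost after day 1:  ~{repo.status_cost():.3f}s")

# Months later: crawling across the code base has hydrated 1M files,
# most no longer used -- the same command is now much slower.
for f in range(1_000_000):
    repo.read(f)
print(f"status cost months later: ~{repo.status_cost():.3f}s")
```

The hydrated set only ever grows unless the developer deliberately cleans up their enlistment, which is exactly the gradual slowdown the quote describes.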