
Comment by kaybe

14 days ago

Oh yeah, I remember how some computer pulled a Windows update over a satellite connection during a research flight (aircraft). That was super expensive, wow. Now Microsoft servers are banned at the outgoing point, since you couldn't reliably stop it on the computer itself and new teams with new computers come in.

I'm not letting Microsoft off the hook here, but if you have an expensive metered connection and you're trusting clients (especially a modern personal computer of any operating system type) to play nicely with bandwidth, that's 100% on you.

  • That's a really sorry state of things, then. There's zero trust in software now, in the literal sense. How did we end up in a world where you can't trust a client to enforce its own documented behavior? How did it become the user's fault for not applying OS- and hardware-level measures, rather than the software vendor's fault, when the "Automatic updates" toggle is a no-op?

    • MBAs/consultants hijacked the industry, along with an influx of people who consider leetcode alone sufficient for hiring. The past 10 years have seen a major injection of these people into big tech. The resulting mess is predictable, and it'll get worse too, which is why we need to break up these companies and let better, more efficient companies take their place rather than letting them subsidize their failures with their monopolies.


    • In an environment where bandwidth utilization costs money I think it's a good belt-and-suspenders approach, regardless of the expected behavior of the clients, to enforce policy at the choke point between expensive and not-expensive.

      (I think more networks should be built with default-deny egress policies, personally. It would make data exfiltration more difficult, give ML algorithms monitoring traffic flows less "noise" to look through, and likely encourage some efficiency on the part of dependencies.)
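
      A default-deny egress policy like that can be sketched with nftables on the choke-point router; this is a hypothetical example (table/chain names and the 192.0.2.x allow-list addresses are made up), not a production ruleset:

      ```shell
      # Hypothetical: drop all forwarded egress by default, then
      # allow only explicitly approved destinations and ports.
      nft add table inet egress
      nft add chain inet egress forward '{ type filter hook forward priority 0; policy drop; }'
      # Let return traffic for already-approved flows through
      nft add rule inet egress forward ct state established,related accept
      # Explicit allow-list: DNS and HTTPS to known-good hosts only
      nft add rule inet egress forward ip daddr 192.0.2.53 udp dport 53 accept
      nft add rule inet egress forward ip daddr 192.0.2.80 tcp dport 443 accept
      ```

      Anything not on the allow-list (a surprise Windows Update fetch, an exfiltration attempt) just never leaves the network.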

    • Software design is not really my wheelhouse so I can't comment meaningfully on that, but on the networking side I can very confidently say it was a poor architecture. You simply cannot assume that all of your clients are going to be both 1) non-malicious and 2) work exactly as you think they will.

      Link saturation would be one of the first things that would come to mind in this situation, and at these speeds QoS would be trivial even for cheap consumer hardware.
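
      As a rough illustration of that QoS point, here's a hypothetical Linux `tc` setup on the uplink interface (interface name and the 2 Mbit figure are assumptions for the example); even cheap hardware can do this kind of shaping:

      ```shell
      # Hypothetical: cap total egress toward the expensive link (eth0)
      # so a single updating client can't saturate it.
      tc qdisc add dev eth0 root handle 1: htb default 10
      tc class add dev eth0 parent 1: classid 1:10 htb rate 2mbit ceil 2mbit
      # fq_codel under the class keeps latency tolerable when the cap is hit
      tc qdisc add dev eth0 parent 1:10 fq_codel
      ```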


    • > How did it get that we live in a world where you can't trust a client to enforce its own documented behavior?

      My guess is a combo of economic incentives and weak legal protections.

      I realize that answer applies to so many issues as to be almost not worth saying, but I think it's still true here.

    • Fair enough, but the fact is that until fairly recently most software wouldn't even pretend to care about conserving bandwidth. I certainly would never expect a desktop OS to do this well, even if MS loves their revenue-generating "bugs."

  • How do you mean? On my Linux laptops, updates never happen unless I trigger them, and nothing really changes even years after the last installation. You could boot it up, use what's there, and just never update.

> since you couldn't reliably stop it on the computer itself and new teams with new computers come in.

Wifi connection settings in Windows have a "metered connection" setting, which disables automatically downloading updates. I don't recall exactly when this was introduced, but I had to use it for a year while I was stuck on satellite internet. You can even set data caps and such.

Of course, it's always off by default, and I have no idea if there's any way to provision the connection via enterprise admin to default to on for a particular network (I would assume not) so you'd be stuck hoping everyone that comes in does the right thing.
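
For the manual route, the setting can at least be scripted per Wi-Fi profile with `netsh` (run as admin); the profile name here is hypothetical, and this still has to be done on each machine rather than pushed centrally:

```shell
:: Hypothetical: mark a known Wi-Fi profile as metered so Windows
:: treats the link as costly and holds back automatic update downloads.
netsh wlan set profileparameter name="FieldSatLink" cost=Fixed
:: Check the result: "Cost" should now read Fixed in the profile details
netsh wlan show profile name="FieldSatLink"
```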

  • It's a good setting. I've found it gets reset sometimes from Windows updates, so you must remain vigilant.