Comment by 0xbadcafebee

1 day ago

This is basically unethical. Imagine anything important in the world that worked this way. "Do nuclear engineering the easy way while it works, and when it stops working, fix the problem."

Software engineers always make the excuse that what they're making now is unimportant, so who cares? But then everything gets built on top of that unimportant thing, and one day the world crashes down. Worse, "fixing the problem" becomes near impossible, because now everything depends on it.

But really, the reason not to do it is that there's no need to. There are plenty of solutions other than using Git that work as well or better without all the pitfalls. The lazy engineer picks bad solutions not because they're necessarily easier than the alternatives, but because they're the path of least resistance for the engineer themselves.

Not only is this not better, it's often actively worse. But this is excused by the same culture that gave us "move fast and break things". All you have to do is use any modern software to see how that worked out. Slow bug-riddled garbage that we're all now addicted to.

Most of the world does work this way. Problems are solved within certain conditions and for use over a certain time frame. Once those change, the problem gets revisited.

Most software gets to take it to more of an extreme than many engineering fields, since there isn't physical danger. It's telling that the counterexamples always use potentially dangerous problems like medicine or nuclear engineering. The software in those fields is held to more stringent standards.

On the other hand, GitHub wants to be the place you choose to build your registry for a new project, and they are clearly on board with the idea, given that they help massive projects like Nixpkgs instead of kicking them off.

As opposed to something like using a flock of free blogger.com blogs to host media for an offsite project.

What is wrong with you? You berated and name-called open source volunteers because a blog post taught you that package managers using Git are "bad." Let me be clear: a 3-minute read of a blog post offers neither moral superiority nor technical insight that surpasses that of actual maintainers.

Contrary to the snap conclusion you drew from the article, there are design trade-offs involved when it comes to package managers using Git. The article's favored solution advocates for databases, which in practice makes the package repository a centralized black box that compromises package reproducibility. It may solve some problems, but it sucks harder in other ways.

The article is also flat-out wrong regarding Nixpkgs. The primary distribution method for Nixpkgs has always been tarballs, not Git. Although the article has attempted to backpedal [1], it hasn't entirely done so. It's now effectively criticizing collaboration over Git while vaguely suggesting that maybe it’s a GitHub problem. And you think what, that collaboration over Git is "unethical"???

On one side, there are open-source maintainers contributing their time and effort as volunteers. On the other, there are people like you attacking them, labeling them "lazy" and bemoaning that you're "forced" to rely on the results of their free labor, which you deride as "slow, bug-riddled garbage" without any real understanding. I know whose side I'm on.

[1]: https://github.com/andrew/nesbitt.io/commit/8e1c21d96f4e7b3c...

Fixing problems as they appear is unethical? Ok then.

You realize, there are people who think differently? Some people would argue that if you keep working on problems you don't have but might have, you end up never finishing anything.

It's a matter of striking a balance, and I think you're way on one end of the spectrum. The vast majority of people using Julia aren't building nuclear plants.

  • Fixing problems when they appear is ethical.

    Refusing to fix a problem that hasn't appeared yet, but has been/can be foreseen - that's different. I personally wouldn't call it unethical, but I'd consider it a negative.

    • The problem is that popularity is governed by power laws.

      Literally anybody could foresee that, _if_ something scales to millions of users, there will be issues. Some of the people who foresee that could even fix it. But they might spend their time optimizing for something that will never hit 1000 users.

      Also, the problems discussed here are not that things don't work; it's that they get slow and consume too many resources.

      So there is certainly an optimal time to fix such problems, which is, yes, OK, _before_ things get _too_ slow and consume _too_ many resources, but is most assuredly _after_ you have a couple of thousand users.

Hold up... "lazy engineers" are the problem here? What about a society that insists on shoving the work product of unfunded, volunteer engineers into critical infrastructure because they don't want to pay what it costs to do things the right way? Imagine building a nuclear power plant with an army of volunteer nuclear engineers.

It cannot be the case that software engineers are labelled lazy for not building the at-scale solution to start with, while at the same time everyone wants to use their work and there are next to no resources for said engineers to actually build the at-scale solution.

> the path of least resistance for themselves.

Yeah, because they're investing their own personal time and money, so of course they're going to take the path of least resistance for them. If society feels that's "unethical", maybe pony up the cash, because you all still want to rely on the work product they are giving out for free.

  • > If society feels that's "unethical", maybe pony up the cash because you all still want to rely on their work product they are giving out for free.

    I like OSS and everything.

    Having said that: ethically, should society be paying for these projects? Maybe that is what should happen. In some places, we have programs to support artists. Should we have the same for software?