Comment by chrisco255

2 years ago

For me, when I was first learning software development 12 years ago, I had heard about Linux and open source, but I didn't really understand how it operated or organized itself. I had seen Wikipedia appear in the early 00s and understood that distributed groups could build something better than centralized entities (such as Microsoft's Encarta or Britannica's encyclopedia). But the analogy of a carefully coordinated, centrally planned cathedral vs. the organized chaos of a bazaar was what helped me understand why software development is quite unlike other engineering disciplines, especially once software gained the ability to update itself over the internet.

You can build software like a traditional engineering project, with a chief architect and a lead engineer drawing up plans alongside a team that maps out all the specs ahead of time. But the internet changed everything. It made distributed coordination possible, and long-running, complex open source projects that could outlive or outgrow their founding team became achievable.

But the reality, today at least, is that this is not how big, successful open source projects get developed. They are very much handled as large engineering projects, with architects and lead engineers for each area or feature. You won't get a new feature into the Linux networking stack without discussing and reviewing it with the network maintainers, for example. And ultimately you'll still need Linus' approval if it is a major feature.

It's also important to note that some of the major open source projects are currently, in practice, collaborations among multiple large corporations. This is clearly how Linux works (the vast majority of contributions come from corporate players such as Microsoft, IBM, and Intel), and the same goes for Clang and Kubernetes.

Firefox and the GNU suite of tools are probably the largest exceptions, along with a few big projects developed by a single company with few outside contributions (the JVM, ElasticSearch, Grafana, etc.).

  • > You won't get a new feature into the Linux networking stack without discussing and reviewing it with the network maintainers, for example

    Sure, there are trusted experts in open source development too. Of course you have to get your PR discussed before it gets checked in! How would the system be resilient to bugs, hacks, or scope creep otherwise? And you're always welcome to fork the project and start your own thing.

    There was a 2017 report on Linux kernel development [1], which states that "Since 2005 and adoption of Git DVCS, 15,637 developers from over 1,400 companies have contributed to the Linux kernel. Since last year, over 4,300 devs from more than 500 companies have contributed to the kernel. Of these, 1,670 contributed for the first time, or about a third of the contributors."

    That's not a centrally controlled and planned engineering project. Microsoft Windows is centrally controlled. Apple's macOS is centrally controlled. What are your odds of getting a kernel feature added to either of those unless you work at those companies? Zero.

    [1] https://www.zdnet.com/article/whos-building-linux-in-2017/

  • Not only are many successful open source projects run as cathedrals, but often it turns out that hunchbacks are in charge of crucial parts of the cathedrals.