Comment by okanat
7 hours ago
The replacement has already happened. It is HTTP and JSON for 99% of the software developed today. C stayed for multiple reasons, but the most obvious ones to me are:
- People just stopped caring about operating systems research and systems programming after ~2005. Actual engineering implementations of the concepts largely stopped after the second half of the '90s. Most developers moved on to making websites or applications in higher-level programming languages.
- C hit a perfect balance: a language small enough to grok, independent of the system manufacturers, reflecting the computer architecture of the '80s, compact in syntax and code length, and quite easy to write compilers for. This led to lots of legacy software being built into the infrastructure that gave birth to today's popular OSes and, more importantly, the infrastructure of the Internet. Add in the dot-com bubble and other crises, and we basically had, and still have, zero economic incentive to replace those systems.
- Culture changed. We used to care more about stability, repairability and reusability. Computers were expensive; so were programmers and software. Now computers are cheap. Our culture is more consumerist than ever. The "move fast and break things" mentality meshed well with economic policy and the zeitgeist, and with AI it will get worse. So trying to make a real alternative to C (as a generic low-level OS protocol) has reduced cultural value / optics. It doesn't fill CVs as well.
It doesn't mean that people haven't tried or even succeeded. Android was successful on multiple fronts in replacing C: its "intents" and its low-level interface description language for hardware interfaces are a great replacement for the C ABI. Windows' COM is also a good replacement that gets rid of language dependence. Newer OSes such as Redox and Fuchsia are still trying.
I never thought I'd see the day that anyone would praise COM.
As an idea it is great, and the tooling available in C++ Builder, Delphi, VB 6, and C++/CX (WinRT is basically COM with extras) is also great.
Using it from MFC is kind of alright.
Using it from .NET depends on whether it's Framework, .NET Native, or modern .NET, with various levels of usability.
Using it from ATL, WRL, or C++/WinRT is, unfortunately, a mess.
> People just stopped caring about operating systems research and systems programming after ~2005.
and so it was that after that date, all development of
came to a grinding halt, and no further work was done.
On Windows, macOS and Android, most of the development on that list is done in C++, not C.
> It doesn't mean that people haven't tried or even succeeded. Android was successful on multiple fronts in replacing C: its "intents" and its low-level interface description language for hardware interfaces are a great replacement for the C ABI. Windows' COM is also a good replacement that gets rid of language dependence. Newer OSes such as Redox and Fuchsia are still trying.
I am not sure I buy this from a system perspective, especially when taking this[1] into consideration.
______
1. Alexis King's reply to "Why do common Rust packages depend on C code?". Link: https://langdev.stackexchange.com/a/3237
Name me a stable binary interface that is not ints and arrays of ints
COM, WinRT, XPC, AIDL.
Now you can move the goal posts and assert that any data serialized into a memory buffer is an array of ints.
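For concreteness, here is a minimal sketch in plain C of what the COM binary contract roughly looks like. The types and names below (Guid, HResult, IExample, DoWork) are simplified stand-ins for illustration, not the real Windows headers: an interface is a pointer to a vtable with a fixed slot layout, where the first three slots mirror IUnknown and later methods are only ever appended, never reordered.

    /* Simplified, illustrative sketch of a COM-style binary interface in C.
     * Not the actual Windows definitions; names are made up for clarity. */
    #include <stdint.h>

    typedef struct Guid { uint32_t a; uint16_t b, c; uint8_t d[8]; } Guid;
    typedef int32_t HResult;                 /* stand-in for HRESULT */

    typedef struct IExampleVtbl IExampleVtbl;
    typedef struct IExample { const IExampleVtbl *vtbl; } IExample;

    struct IExampleVtbl {
        /* The first three slots mirror IUnknown and form the stable core. */
        HResult  (*QueryInterface)(IExample *self, const Guid *iid, void **out);
        uint32_t (*AddRef)(IExample *self);
        uint32_t (*Release)(IExample *self);
        /* Interface-specific methods follow, appended but never reordered. */
        HResult  (*DoWork)(IExample *self, int32_t input, int32_t *result);
    };

    /* A caller only needs the vtable layout and calling convention; it does
     * not care what language implemented the object behind the pointer. */
    static HResult call_do_work(IExample *obj, int32_t input, int32_t *result)
    {
        return obj->vtbl->DoWork(obj, input, result);
    }

The stability comes from that fixed layout plus the calling convention, IID-based interface negotiation, and reference-counting rules, which is what lets callers in other languages consume the object without recompiling against its source.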
In pieces of my code, I need to call setuserid() to manage some of the security that I designed in 2010.
There was no Rust at that point, and I used the most basic tool that could do it.
Could I have done this in Java with the gymnastics of JNI, linking C into the JRE?
Definite maybe.
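For a rough idea of what that JNI route would involve (assuming setuserid() refers to the POSIX setuid() family; the class and method names below are hypothetical, chosen only for illustration), the native side is a small C stub compiled into a shared library:

    /* Hypothetical JNI glue: the native side of a Java method
     * com.example.Priv.setUid(int) that calls POSIX setuid(). */
    #include <jni.h>
    #include <unistd.h>
    #include <sys/types.h>

    JNIEXPORT jint JNICALL
    Java_com_example_Priv_setUid(JNIEnv *env, jclass cls, jint uid)
    {
        (void)env;   /* unused: no Java objects are touched */
        (void)cls;
        /* setuid() returns 0 on success, -1 on failure (errno set). */
        return (jint)setuid((uid_t)uid);
    }

On the Java side you would declare a matching private static native int setUid(int uid), load the library with System.loadLibrary(), and ship a separately built native binary per platform, which is the "gymnastics" part.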
Yes, nowadays with Panama; and even before Rust was around, JNA was already there, so using JNI wasn't strictly necessary.
>Culture changed. We used to care more about stability, repairability and reusability. Computers were expensive; so were programmers and software. Now computers are cheap. Our culture is more consumerist than ever. The "move fast and break things" mentality meshed well with economic policy and the zeitgeist, and with AI it will get worse. So trying to make a real alternative to C (as a generic low-level OS protocol) has reduced cultural value / optics. It doesn't fill CVs as well.
IMO I do see this changing in the future as higher-powered computers become expensive once again, and I'm not just referring to the recent chip shortage.