
Comment by throwaway38583

3 years ago

Congratulations to the author. Things like this are why I hope Carbon exists. Evolving C++ seems like a dumpster fire, despite whatever compelling arguments about compatibility you are going to drop on me.

The issue is that a lot of people just think about languages the wrong way, which is the whole reason for pointless things like C++ expansions, Carbon, Rust, and stuff like this.

One of the fundamental ideas that people run with in language creation/expansion is "programmers are stupid and/or make mistakes" -> "let's add language features that intercept and control their stupidity/mistakes".

And there is a very valid reason for this - it allows programmers of lesser skill and knowledge to pick up codebases and develop safe software, which has economic advantages: you can hire less experienced devs at lower salary points and spend next to no time fixing segfault issues caused by complex memory management. The whole reason Java got so popular over C++ was its GC - both C++ and Java supported fairly strong typing with classes, but C++ still had a lot of memory-management semantics that had to be taken care of, whereas with Java you simply don't do anything.

However, people are applying this idea to lower-level languages, because they want the high performance of a compiled language with a whole bunch of features that make writing code as mistake-free as possible. And my challenge to that is this - why not spend the time just making smarter compilers/tooling?

Think about a hypothetical case where Rust gets all the features people want added to it, and is widely used as the main language over all others. Looking across the codebases, there will be a lot of common use patterns, a lot of the safety code duplicated over and over in predictable ways, etc. And you will see these common things added to Rust. Just like with Java, where a lot of the predictable use patterns got abstracted into widely used libraries like Lombok, Spring, etc., so that you don't have to worry about correctness because the library handles it. You essentially start to move towards more and more stuff being handled for you automagically, which is all part of the compiler/toolchain.

In the same way, #embed can be solved by a smart compiler. Open a file by a constant path and read its contents into a buffer that never changes? Auto-include that file in the binary if you are targeting performance rather than executable size. No need for a special directive - just be smart about how you handle the open call, and leave the fine-tuning to specific compiler options.
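To make that concrete, here is a minimal sketch of the two approaches. The load_blob function and the shader.bin filename are invented for illustration: the first half is the runtime open/read idiom a sufficiently smart compiler would have to recognize and fold into the binary, and the second half is the C23 #embed directive that does it explicitly.

    #include <stdio.h>
    #include <stdlib.h>

    /* The idiom described above: a constant path, read once into a
       buffer that is never modified. The claim is that a smart enough
       compiler could spot this and bake the file in at build time. */
    static unsigned char *load_blob(size_t *len)
    {
        FILE *f = fopen("shader.bin", "rb"); /* path known at compile time */
        if (!f) return NULL;

        fseek(f, 0, SEEK_END);
        long size = ftell(f);
        rewind(f);
        if (size < 0) { fclose(f); return NULL; }

        unsigned char *buf = malloc((size_t)size);
        if (buf && fread(buf, 1, (size_t)size, f) != (size_t)size) {
            free(buf);
            buf = NULL;
        }
        fclose(f);

        if (buf) *len = (size_t)size;
        return buf;
    }

    /* What C23 standardized instead: the file's bytes become the
       array initializer at translation time, with no runtime I/O. */
    static const unsigned char blob[] = {
    #embed "shader.bin"
    };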

And from the economic ease-of-use perspective above, you would have a language like Python, which is super easy to pick up and program in, except instead of the interpreter you would have a compiler that spits out binaries. Python is already widely adopted primarily because of how easy it is to set up and use. Now imagine if you had the option to run a super smart compiler that highlights any potential issues that come with dynamic typing because it understands what you are trying to do, fixes any that it can, and once everything is addressed, spits out an optimized, memory-safe executable. With Rust, you code, compile, see you made a mistake somewhere with a reference, fix it, repeat. With this, you would code, compile, fix the mistake the compiler warns you about, repeat. No difference.

Focusing on the toolchain also lets you think about integrating provability features from languages like Coq, where you can focus not only on processing/memory correctness but also on "is the output actually correct". That is, any piece of code can be specified to have a guaranteed bounded output set for all given inputs, and you can integrate that into IDE tools for real-time feedback, so you design the code in a way that avoids things like URL-parsing mistakes, which all of Rust's safety features won't catch.
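As a toy version of what a "guaranteed bounded output set" could look like, here is a minimal sketch in Lean 4 (clamp and clamp_le are invented names, not from any real codebase): the theorem pins down, for every possible input, a bound the function's output can never escape, which is exactly the kind of fact an IDE could surface in real time.

    -- A function together with a machine-checked bound on its output:
    -- for every input x, clamp x is guaranteed to stay within 0..255.
    def clamp (x : Nat) : Nat :=
      if x ≤ 255 then x else 255

    -- The "bounded output" specification, proved for all inputs.
    theorem clamp_le (x : Nat) : clamp x ≤ 255 := by
      unfold clamp
      split <;> omega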

As for C, you leave it at a version that has a stable, robust ABI, and then anything new you need to support gets delegated to custom tools. That way, in a future where compute will likely be full of specialized ML chips, instead of worrying about writing a frontend that supports every feature, you can quickly get a notional toolchain made and run existing C code.