Time-Traveling to 1979: Advice for Designing 'C with Classes'

8 days ago (coderschmoder.com)

The best advice is probably "don't", as it usually is for anyone setting out to design a programming language, and even more so for anyone setting out to build a mostly backwards-compatible extension to a language that isn't suited for what you want it to do.

The second-best advice is probably: do just C with classes. Allow defining your own allocator for making objects of those classes. It's fine if objects built with one allocator can only refer to objects built by the same one.

Don't do templates; just do the minimum needed for a container type to know what type it contains, for compile-time type checking. If you want to build a function that works on all numbers regardless of whether they are floats or complex or whatever, don't, or make it work on those classes and interfaces you just invented. A Float is a Number, as is an Integer. Put all the cleverness you'd waste on templates into making the compiler somewhat OK at turning that into machine types.

Very specifically don't make the most prominent use of operator overloading a hack to repurpose the binary left shift operator to mean `write to stream`. People will see that and do the worst things imaginable, and feel good about themselves for being so clever.

I have no doubt that had this happened nothing would have changed. C++'s legacy is that every positive incremental change is implemented at the last possible moment and in the most frustrating and caveat-laden manner.

The individual merits of language features hold relatively little value compared to the sausage making machine that is the C++ language evolution process.

I really am curious why the article goes with just implementing templates early. If the question is going back from today (or even 2013, the year Bjarne posed the question to his class), why would someone recommend templates when typed polymorphic datatype constructors are a sounder method for implementing 'generics' (and also make it easier to produce sensible error messages)?

Also, why go with constexpr (which is not as expressive, unless I have badly misunderstood how it works) as a replacement for pre-processor macros? There have been type-safe and sound implementations of macros, along with explicitly staged computation, since the early 2000s; why would that not be preferable?

I think the article is a fun thought exercise, but it sticks too closely to what C++ has become in our timeline and ignores better alternatives that, if explained and implemented at the outset, would have resulted in a language that retained the performance and abstraction characteristics of today's C++ while placing it on a sound foundation for further evolution as the language adapts to changes in the industry at large.

I can't see this title without recommending the movie Time After Time, starring Malcolm McDowell and Mary Steenburgen.

The plot is based on the premise that H.G. Wells actually invents a time machine, and it's used by Jack the Ripper to travel to 1979 San Francisco.

What I'd tell Bjarne:

- In the future, you'll carry in your pocket a computer more powerful than the sum of all computers currently present at the university

- The unchecked flat memory model of C will cause numerous security issues with sometimes grave consequences in the "real world"

- Follow the design of Standard ML (SML) and adapt it to systems programming (yeah, it appeared in 1983, but surely papers have been published before that)

- Do not even think about using unsigned types for sizes, and get rid of implicit numeric conversions: `if (v.size() - 1 < 0)` fails on an empty vector in today's C++

- Deterministic resource management is still important and is _the_ feature that C++ gets praised for.

- Lack of standard ABI will cause a lot of headaches and lost time.

- I would tell him about LLVM IR, .NET assemblies, metadata and encourage him to first standardize an intermediate format which the compiler could read and write. That'd ensure seamless interoperability between compilers and even other languages.

- Related to the above point: the header/source split will become a burden.

  • C++ created maintenance disasters. The best thing that could happen to C++ is to be killed off once and for all, with Go as a systems language and maybe Rust for the rest.

In 1979 the “standard practice in C of passing a large struct to a function” wasn’t just not standard practice, it didn’t exist!

All you could pass as a parameter to a function were pointers to structs. In fact, with one exception, all parameters to functions were basically a machine word: either a pointer or a full-size int. The exception was doubles (all floating-point args were passed as doubles).

Hmm... maybe two exceptions? Not sure about long.

The treatment of structs as full values that could be assigned and passed to or returned from functions was only introduced in ANSI C, 1989.

And of course the correct recommendation to Bjarne would be: just look at what Brad is doing and copy that.

  • According to https://www.nokia.com/bell-labs/about/dennis-m-ritchie/chist..., which is authoritative:

    > During 1973-1980, the language grew a bit: the type structure gained unsigned, long, union, and enumeration types, and structures became nearly first-class objects (lacking only a notation for literals).

    And

    > By 1982 it was clear that C needed formal standardization. The best approximation to a standard, the first edition of K&R, no longer described the language in actual use; in particular, it mentioned neither the void or enum types. While it foreshadowed the newer approach to structures, only after it was published did the language support assigning them, passing them to and from functions, and associating the names of members firmly with the structure or union containing them. Although compilers distributed by AT&T incorporated these changes, and most of the purveyors of compilers not based on pcc quickly picked them up, there remained no complete, authoritative description of the language.

    So passing structs entered the language before C89, and possibly was available in some compilers by 1979. I was very active in C during this period and was a member of X3J11 (I happen to be the first person ever to vote to standardize C, due to alphabetical order), but unfortunately I'm not able to pin down the timing from my own memory.

    P.S. Page 121 of K&R C, first edition, says "The essential rules are that the only operations that you can perform on a structure are take its address with &, and access one of its members. This implies that structures may not be assigned to or copied as a unit, and that they cannot be passed to or returned from functions. (These restrictions will be removed in forthcoming versions.)"

    So they were already envisioning passing structs to functions in 1978.

It's super weird to say that we need rvalue references. Rvalues with their odd semantics are only needed so as not to break compatibility with the existing reference/temporary rules. Instead, passing objects by move should be built into the language: each class should be movable by default, with proper support in the language.

In 1979, the industry needed 'C with Classes'. It did not need whatever is required today. Hence the only viable path is the one we're on. Counterpoint: who is using Pony (the programming language) today? No one.

  • 7 years ago, my graduate distributed systems professor required everyone to complete his projects in Elixir because it was trending on HN. It was my first functional language and after getting over the initial hump I fell in love with it.

    Now, I’m teaching undergraduate courses of my own and, while I do not have the flexibility to change the languages used in my current offerings, if I ever start teaching a systems programming course I will absolutely require the students to use Pony.

  • Curious as to why you chose Pony as your example of an unused language. Any specific reason, or was it just the first one you thought of that fit the sentence?

    I’m not sure if Pony is still being used, but the language was making some headway, at least on the PLT side of things. I know the inclusion of some of their reference capabilities work (and practical implementation of prior research in the area) would be a benefit to greenfield programming language design. I think they missed going the process calculus route, instead choosing actors, but overall I liked the direction.