
Comment by WalterBright

19 hours ago

D is an elegant re-imagine of C and C++. For a trivial example,

    typedef struct S { int a; } S;

becomes simply:

    struct S { int a; }

and unlike C:

    extern int foo();
    int bar() { return foo(); }
    int foo() { return 6; }

you have:

    int bar() { return foo(); }
    int foo() { return 6; }

For more complex things:

    #include <foo.h>

becomes:

    import foo;

Your first example doesn't make sense, because

    struct S { int a; };

is also fine and idiomatic in C. It is rather

    typedef struct S { int a; } S;

that doesn't make sense, because why would you make something opaque and expose it immediately again in the same line?

The others are ... different. I can't tell whether they are really better. Maybe the second, although I like that the compiler forces me to forward-declare things; it makes the code much more readable. But then again, I don't really see the benefit of

    import foo;

vs

    #include <foo>

.

There is no difference between include and import. `#` vs nothing makes it clear that it is a separate facility rather than just a language keyword. `<` vs `"` makes it clear whether you are using your own stuff or stuff from the system. And what do you do when your filename contains spaces? Does `import foo bar;` work for including a single file named "foo bar"?

  • > is also fine and idiomatic in C

    It's inelegant because without the typedef you always need the `struct` prefix, which no other type requires. It also makes it clumsier to refactor the code (adding or removing the leading `struct`). The typedef workaround is extremely commonplace.
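    A minimal C sketch of the contrast (the type names here are illustrative):

    ```c
    #include <stdio.h>

    /* Without a typedef, every use of the type needs the `struct` prefix. */
    struct point { int x, y; };

    /* The common workaround: typedef the tag so it reads like any other type. */
    typedef struct point2 { int x, y; } point2;

    int main(void) {
        struct point a = {1, 2};   /* prefix required */
        point2 b = {3, 4};         /* reads like a built-in type */
        printf("%d %d\n", a.x + a.y, b.x + b.y);
        return 0;
    }
    ```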

    > I like it that the compiler forces me to forward type stuff, it makes the code much more readable

    That means when opening a file, you see the first part of the file first. In C, then you see a list of forward references. This isn't what you want to see - you want to see first the public interface, not the implementation details. (This is called "above the fold", coming from what you see in a folded stack of newspapers for sale. The headlines are not hidden below the fold or in the back pages.) In C, the effect of the forward reference problem is that people tend to organize the code backwards, with the private leaf functions first and the public functions last.
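    A small C sketch of the "public interface first" ordering that forward declarations make possible (function names are made up for illustration):

    ```c
    #include <stdio.h>

    /* Forward declaration of the private helper. */
    static int helper(int x);

    /* Public interface first ("above the fold"). */
    int api_double_plus_one(int x) { return helper(x) + 1; }

    /* Private leaf function last; without the forward declaration above,
       C would force this definition to appear before its caller. */
    static int helper(int x) { return 2 * x; }

    int main(void) {
        printf("%d\n", api_double_plus_one(20));
        return 0;
    }
    ```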

    > include vs import is no difference

    Oh, there is a looong list of kludgy problems stemming from a separate macro processor that is a completely distinct language from C. Even the expressions in a macro follow different rules than in C. If you've ever used a language with modules, you'll never want to go back to #include!
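    Two classic instances of the kludges mentioned, sketched in C: macro expansion is textual, and preprocessor expressions follow their own rules:

    ```c
    #include <stdio.h>

    /* Textual expansion: the body is pasted in before the compiler sees it,
       so operator precedence applies to the expanded text, not the argument. */
    #define SQUARE(x) x * x

    /* In #if expressions, undefined identifiers silently evaluate to 0,
       so this condition holds even though neither name was ever defined. */
    #if UNDEFINED_A == UNDEFINED_B
    #define MSG "both undefined names compare equal in #if"
    #endif

    int main(void) {
        printf("%d\n", SQUARE(1 + 2)); /* expands to 1 + 2 * 1 + 2 = 5, not 9 */
        puts(MSG);
        return 0;
    }
    ```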

    > What do you do when your file contains spaces?

    A very good question! The module names must match the filename, and so D filenames must conform to D's idea of what an identifier is. It sounds like a limitation, but in practice, why would one want a module name different from its filename? I can't recall anyone having a problem with it. BTW, you can write:

        import core.stdc.stdio;
    

    and it will look up `core/stdc/stdio.d` (Linux, etc.) or `core\stdc\stdio.d` on Windows.

    • > It's inelegant

      We obviously disagree about the code organization we each prefer (I find the C way rather elegant), so this doesn't sound like a substantive discussion. You, as the language's author, are obviously quite content with the choices D made.

      > This is inelegant because all other types do not need a prefix.

      I don't find that. It makes it possible to clearly distinguish between transparent and opaque types. Because tags live in a separate namespace, you can also use the same identifier for the type and the object. That is not always a good choice, but when there really is no point in inventing a second name for one of the two, it is. (So I can write `struct message message;`.) It also makes it really easy to create ad-hoc types, which honestly is the killer feature that convinced me to switch to C. I think this is the most elegant way to create new types for single use, short of getting rid of explicit types altogether.
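      Both points can be shown in a few lines of C (the `message` name follows the example above; the rest is illustrative):

      ```c
      #include <stdio.h>

      struct message { int id; const char *text; };

      int main(void) {
          /* Tags live in their own namespace, so the object can reuse the name. */
          struct message message = {1, "hello"};

          /* An ad-hoc, single-use aggregate declared right where it is needed. */
          struct { double x, y; } origin = {0.0, 0.0};

          printf("%d %s %.1f\n", message.id, message.text, origin.x + origin.y);
          return 0;
      }
      ```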

      > It also makes it clumsier to refactor the code (adding or subtracting the leading `struct`).

      I never had that problem, and don't know when it occurs and why.

      > The typedef workaround is extremely commonplace.

      In my opinion that is not a workaround but a feature. I also use typedefs when I want to declare an opaque type. That way, in the header file all function declarations refer to the opaque type, and in the implementation the type is only used with `struct`. This also makes it obvious which types' internals you are supposed to touch and which not. (This is also what, e.g., the Linux kernel style guide recommends.)
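      A sketch of that opaque-type pattern, collapsed into one file for brevity (the header/implementation split is marked with comments; the `counter` API is invented for illustration):

      ```c
      #include <stdio.h>
      #include <stdlib.h>

      /* --- What the header would expose: an opaque typedef and functions. --- */
      typedef struct counter counter;
      counter *counter_new(void);
      int counter_bump(counter *c);
      void counter_free(counter *c);

      /* --- What only the implementation sees: the struct internals. --- */
      struct counter { int value; };

      counter *counter_new(void) {
          counter *c = malloc(sizeof *c);
          if (c) c->value = 0;
          return c;
      }

      int counter_bump(counter *c) { return ++c->value; }

      void counter_free(counter *c) { free(c); }

      int main(void) {
          counter *c = counter_new();
          counter_bump(c);
          printf("%d\n", counter_bump(c));
          counter_free(c);
          return 0;
      }
      ```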

      > This isn't what you want to see - you want to see first the public interface, not the implementation details.

      Maybe you do, but I don't. Since in C the public interface and the implementation are split into different files, this problem doesn't occur. When I want to see the interface, I read the interface definition. When I look into the implementation file, I definitely don't expect to read the interface; what I see first are the dependencies (includes) and then the internal types. This fits "Show me your flowchart and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won't usually need your flowchart; it'll be obvious." Then I typically see default values and configuration. After that, yes, I see the lowest-level functions.

      > people tend to organize the code backwards, with the private leaf functions first and the public functions last.

      Which results in a consistent organization. It also fits how you would write it in math or an academic context: you only use what is already defined. It makes the file readable from top to bottom. And when you are just looking for a specific thing instead of reading it in full, you are searching and jumping around anyway.

      > Oh, there is a looong list of kludgy problems stemming from a separate macro processor that is a completely distinct language from C. Even the expressions in a macro follow different rules than in C. If you've ever used a language with modules, you'll never want to go back to #include!

      A macro language is surprising for the newcomer, but you get used to it, and I don't think there is a problem with include. Textual inclusion is about the simplest mental model you can have, and it is easy to control and verify. Coming from a language with modules before learning C, I never found it to be an issue, and I find the emphasis on the bare filesystem rather refreshing.

      > but in practice, why would one want a module name different from its filename?

      True, I actually never wanted to include a file with spaces, but it is a point where your concept breaks down. Also, you can write `#include "foo/bar/../baz"` just fine, and can even use absolute paths if you feel like it.

      2 replies →

Some of these are definitely nice-to-haves*, but when you're evaluating a C++ alternative, there are higher-priority features to research first.

How are the build times? What does its package system(s) look like, and how populated are they? What are all its memory management options? How does it do error handling and what does that look like in real world code? Does it have any memory safety features, and what are their devtime/comptime/runtime costs? Does it let me participate in compile time optimizations or computations?

Don't get me wrong, we're on the same page about wanting to find a language that fills the C++ niche, even if it will never be as ideal as C++ in some areas (since C++ is significantly worse in other areas, so it's a fair trade off). But just like dating, I'm imagining the fights I'll have with the compiler 3 months into a full time project, not the benefits I'll get in the first 3 days.

* (a) I've been using structs without typedef without issue lately, which has its own benefits such as clarifying whether the type is simple or aggregate in param lists, while auto removes the noise in function bodies. (b) Not needing forward declarations is convenient, but afaik it can't not increase compile times at least somewhat. (c) I like the consistency here, but that's merely a principle; I don't see any practical benefit.

  • Build times are quite a bit faster.

    The package system is called dub.

    Memory management options include:

    1. stack allocation

    2. malloc allocation

    3. write your own allocator

    4. static allocation

    5. garbage collection

    You can use exceptions or returns for error handling.

    The biggest memory safety feature it has is length-delimited arrays. No more array overflows! The cost of it is the same as in std::vector when you do the bounds checked option. D also uses refs, relegating pointers to unusual uses. I don't know what you mean by "participating in optimizations".
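    D builds the length into the array type itself; a rough C analogue of such a length-delimited, bounds-checked slice might look like this (the `int_slice` type and helper are invented for illustration):

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    /* Pointer plus length, checked on every access --
       roughly what D's arrays provide as a built-in type. */
    typedef struct { int *ptr; size_t len; } int_slice;

    int slice_get(int_slice s, size_t i) {
        if (i >= s.len) {            /* the bounds check D performs for you */
            fprintf(stderr, "range violation\n");
            exit(1);
        }
        return s.ptr[i];
    }

    int main(void) {
        int data[3] = {10, 20, 30};
        int_slice s = {data, 3};
        printf("%d\n", slice_get(s, 2));  /* in range */
        return 0;
    }
    ```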

    (a) C doesn't have the hack that C++ has regarding the tag names. D has auto.

    (b) D has much faster compile times than C++.

    (c) The practical benefit is the language is much easier to master.

Everything except the import looks like standard C++ since at least C++98.

  • C++ does not allow forward references outside of structs. The point-of-instantiation and point-of-declaration rules for templates produce all kinds of subtle problems. D does not have that issue.

    Yes, you absolutely can get the job done with C and C++. But neither is an elegant language, and that puts a cognitive drag on writing and understanding code.