Comment by silasdb

6 years ago

The whole program trusts definitions in mac.h [1] like:

    #define IF if(
    #define THEN ){
    #define ELSE } else {
    #define ELIF } else if (
    #define FI ;}

    #define BEGIN {
    #define END }
    #define SWITCH switch(
    #define IN ){
    #define ENDSW }
    #define FOR for(
    #define WHILE while(
    ...

Isn't this considered bad practice nowadays? After taking a glance at the code, I can see some possible advantages, such as making it harder to forget a closing brace. Is there any other explanation for why they created a dialect on top of C using the preprocessor?

[1] https://www.tuhs.org/cgi-bin/utree.pl?file=V7/usr/src/cmd/sh...

EDIT: fix English

Bourne liked ALGOL. A lot. So much so that he was one of the few people who wrote their own ALGOL 68 compiler. Using the preprocessor to feel more at home is a pretty good idea in this case.

This wasn't particularly popular with anyone who wasn't, well, Bourne, even at the time. I posted an example here: https://news.ycombinator.com/item?id=22199664

I'm under the impression that building up higher-level languages using macros was very common among assembly programmers. Since C was new, whoever wrote this may have come from assembly and brought the habit with them.

The author worked on ALGOL 68C and probably intended to reuse the more familiar syntax.

Nowadays, I would indeed consider it bad practice to use such macros, especially if you intend to share the project with anyone else.