This article doesn't use the name "Lisp" enough. The language with the best chance of lasting a long time is the one with the simplest syntax. That is Lisp. It is already one of the oldest programming languages, and the Lisp family of languages is still mutually intelligible despite being large. The language core settled long ago, and the rest of the programming community still hasn't quite caught up with the Common Lisp standard library.
Will a Lisp ever be the most popular language? Probably not. Maybe. Will they last 100 years? Easily. One or more of the current Lisps will still be there. If computers exist in 2123, someone will be making money using a current Lisp. Hopefully they'll be using one that has discovered the words "first" and "rest".
Personally, the biggest benefit of Lisps is that features can be added to the language not only by the language designers, but also by the users.
JavaScript wanted to add syntactic sugar for async/await, so the language had to be redesigned with that in mind. In a Lisp, async/await could be implemented by a library instead, keeping the core small and nimble.
This is of course also a footgun when used improperly: some codebases become harder for new developers to grok, as they basically have to learn a new language built on top of the standard one. But I've found that if you keep the "new language features" to a minimum and have good documentation, developers quickly get on board and start to be productive.
That footgun doesn't seem to be a huge problem. The culture around macros seems to be "avoid if possible, use if necessary". It just becomes a skill like any other related to programming.
The alternative is _extremely costly_ in comparison. Code generators, transpilers, configuration, tooling, bespoke IDE features... All of that, diverging from each other in the most idiosyncratic ways and it all needs version control, RFCs, release management, documentation, design and development, breaking changes...
But with an extensible language you get all of this basically for free. People just make things, share things, and the most useful and stable things bubble up.
Javascript programmers have to learn frameworks. You cannot escape from the fact that programs extend the language.
When you write a function, or define a new type, you're extending a language.
In human languages, nobody would dispute the idea that the formation of new nouns and verbs extends the language.
Lisp is kind of like a language where you can add a new conjunction and preposition, not just a new noun or verb. To the uninitiated, that somehow seems like it must be harder.
The fundamental problem is that you have some piece of syntax, and it denotes some process (or whatever) which is entirely implicit. Whether that syntax is a new kind of macro-statement or function call is immaterial. If you have no idea what it is, you have to go to the definition.
The behavior behind a single function call can be mind-bogglingly complex.
> JavaScript wanted to add syntactic sugar for async/await, so the language had to be redesigned with that in mind. In a Lisp, async/await could be implemented by a library instead, keeping the core small and nimble.
Sure, but try hacking in infix notation or more complex notations like list[index] access, and you'll quickly see why hacking that stuff in is a bad idea. Lisp severely punishes adding syntactic sugar if it diverges at all from prefix s-expressions. Look at Clojure's member variable access for a real life example of how this plays out.
And if we're willing to make concessions that our syntactic sugar can only be as sweet as the overall design of the language allows, I think it makes sense to concede the same to Javascript and admit that Promises existed as libraries for years before async/await, and worked just fine in lots of production code.
> The language with the best chance of lasting a long time is the one with the simplest syntax.
If you're going to make an argument for Lisp, I think focusing on syntax is the weakest argument you could make. Simplicity is good, sure, but syntactic simplicity is a very surface-level form of simplicity. Consider:

    (def inc (n)
      (+ n 1))

    def inc(n):
        return n + 1

    fn inc(n) {
        return n + 1;
    }

    inc(N) -> N + 1.
These are fictional example syntaxes, but you can see where my inspirations come from. The point is, these all express the same things. There's some argument about which syntax is clearest, but that's mostly going to be based on what languages people already know. It's a bit silly to argue what's clearest from some sort of ideal pure world where people coming into your language don't know any other languages, because that's not the world we live in.
Now consider:

    (def void incAll (((list int) ns))
      (decl int i)
      (for (= i 0) (< i (length ns)) (++ i)
        (++ ([] ns i))))

    def incAll(ns) {
      return map(ns, n => n + 1);
    }
In the first example, we're doing C-ish things in Lisp-ish syntax, and in the second example we're doing Lisp-ish things in C-ish syntax. As you can see, doing Lisp-ish things in C-ish syntax works pretty well (and lots of modern languages do this). But doing C-ish things in Lisp-ish syntax is an abomination--in fact, the simpler syntax actually forces us to do more weird stuff to get around not having more complex syntax for more complex operations.
This gives us a clue that maybe simple syntax isn't inherently simpler to use. At least some of the simplicity of Lisp comes from the other ideas it brings to the table. And notably, nothing prevents us from using those ideas in other languages.
No discussion of Lisp syntax can fail to mention that Lisp's simpler expressions enable its powerful macros. Lisp true believers will wax poetic about how Lisp macros allow you to create domain specific languages. But in practice, macros are often just an opportunity to shoot yourself in the foot and create hard-to-debug bugs. If you're introducing macros, the simplicity argument starts to fall apart because you're adding a massive amount of complexity.
Macros are one thing that you easily get from that syntax. And I would argue that it's far less of a footgun than you make it out to be.
But really there are many things that aren't mentioned such as Editor/IDE tooling, in-editor REPL, evaluating expressions, debugging, structural editing, code navigation, formatting...
Your example is actually kind of misleading, because the second variation is much closer to how you write in a Lisp than the first.
It would really be something like:
```
(def inc-all (partial map inc))
```
It really makes no sense to use a Lisp in a non-expression-based manner. The syntax is inherently optimized for it.
To my mind nothing can be as readable as:
Which is a more casual way to express:
The latter is actually valid Ruby code, but the former is not valid in any programming language I’m aware of. Yet they are a simple token substitute version of each other. I purposefully placed spaces in the latter to better reflect that.
Note that going a tiny bit further, you could easily get rid of the "apply" token in the former with some convention regarding precedence in denotation.
And yet the whole industry prefers to clutter its communicated ideas with all kinds of symbols that are impenetrable to the layman.
Just for fun, have a look at Ngram results for some of these programming language names:
https://books.google.com/ngrams/graph?content=Lisp%2CRuby%2C...
Of course, this doesn’t mean much, as Ruby and Python will most likely have a huge hit count, if not most of it, unrelated to programming languages. That’s also a lesson for naming programming languages, I guess. Like everything vaguely named, C is really awkward in this regard.
What is Lisp used for in production these days? The wiki didn’t really specify, just that it’s connected to mathematics and AI research.
Lisp is a language family. Popular, well known choices include for example: Common Lisp, Scheme, Racket, Emacs Lisp and Clojure.
There are countless production systems written in these languages, ranging from embedded systems to web apps to infrastructure tooling. The specific domains where they're applied are just as diverse as one could imagine.
The more interesting question is "Why would someone use any of these languages?".
Niche languages are typically associated with risk in the business world. But the thing is that Lisp just keeps surviving, evolving and finding new problems and domains to tackle.
My personal opinion is that these languages represent the powerful combination of freedom, stability and engagement.
A Lisp is inherently non-condescending, as it gives you more powerful tools than most other languages, but it's also very reliable because it's built on a very well-understood, minimal foundation. Last but not least, you are programming in a way that is very engaging: you are right there in the running program.
Some web apps also run Clojure (a lisp for the JVM) for backend and ClojureScript (Clojure compiled to JavaScript) for frontend. Probably Nubank ("largest fintech bank in Latin America") is the biggest company I know using Clojure in production in various ways.
How about this? https://aws.amazon.com/marketplace/pp/prodview-otb76awcrb7aa
Or this? https://penpot.app/
Or this? https://whimsical.com/
This web site, for example.
> (on SIMD) C never really added any kind of abstraction for them.
ISPC and various C-derived GPU shading languages would beg to differ. But then the next question is: are high-level abstractions and 'compiler magic' even all that useful for SIMD, or are intrinsics the better solution?
None of those things are part of the C standard of course, but in the real world the C standard does not matter - only the feature set of actually existing compilers does.
Also I wish that obsession with the PDP-11 would finally die. C seems to be a pretty good match for all sorts of von-Neumann-computers, otherwise it would have died already.
IMHO programming languages live and die with the hardware they need to - uh - 'program'. A hardware architecture which requires an entirely new approach to programming will naturally require entirely different programming languages (and our computers are still close enough to computers from the '50s and '60s that the same programming languages map pretty well).
It makes a lot more sense to speculate about what hardware will look like in 100 years, because only that will tell us what programming languages might look like.
I agree. For instance, gcc's vector_size attribute works just fine in C; I could declare a type like this:
typedef uint32_t myVec __attribute__((vector_size(16)));
and then, when using that type in C, gcc will a lot of the time do a great job vectorizing without my having to reach for intrinsics, judging by the generated code.
Sure, it's not standard C, but it still feels like C when using them. Even for things like shuffling, it's still pretty logical to just treat it as a function, for instance the "__builtin_shuffle" builtin that gcc provides for those vector types.
The semantics of C can handle such things; it's just not standardized, if you get what I mean. So I get what you're saying about the PDP-11.
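To make that concrete, here's a minimal sketch (my own illustration, not part of the comment above) of the vector-extension style being described. It assumes gcc, since `__builtin_shuffle` is a gcc builtin that clang spells differently:

```
#include <stdint.h>
#include <stdio.h>

/* Same idea as above: vector_size is in bytes, so 16 bytes = 4 lanes of uint32_t. */
typedef uint32_t myVec __attribute__((vector_size(16)));

int main(void) {
    myVec a = {1, 2, 3, 4};
    myVec b = {10, 20, 30, 40};

    /* Element-wise add: gcc emits a SIMD add where the target supports it,
       no intrinsics needed. */
    myVec sum = a + b;

    /* __builtin_shuffle (gcc-specific): the mask says which source lane
       feeds each destination lane; this mask reverses the lanes. */
    myVec mask = {3, 2, 1, 0};
    myVec rev = __builtin_shuffle(sum, mask);

    for (int i = 0; i < 4; i++)
        printf("%u ", (unsigned)rev[i]);   /* prints: 44 33 22 11 */
    printf("\n");
    return 0;
}
```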
Just because certain shading languages put on a C-like syntax doesn’t make them anywhere close to C. They have entirely different semantics. So no, C is just terrible when it comes to SIMD, but also when it comes to multi-threading, two areas that are the most important improvements of modern hardware.
Some are less different than others, for instance:
"The Metal programming language is a C++14-based Specification with extensions and restrictions. Refer to the C++14 Specification (also known as the ISO/IEC JTC1/SC22/WG21 N4431 Language Specification) for a detailed description of the language grammar."
(from: https://developer.apple.com/metal/Metal-Shading-Language-Spe...)
Meaning essentially that MSL is C++14 with a couple of restrictions (mostly concerning pointers), a couple of SIMD data types, and some custom attributes.
...the C++14 could just as well be replaced with some C standard, if the Metal designers didn't like C++ so much for mysterious reasons (and that arguably would have been the better choice, because for shader programming C++ really doesn't add that much useful stuff over C).
I use Perl because the community prioritizes backwards compatibility.
For the most part, you can run the same Perl script over 25 years of Perl releases, going all the way back to 5.000.
This is one of the most important features for me, and I'm so grateful.
Prolog got halfway there last year. It will go the distance. C as well. Lisp of course. The reason is that they define their paradigm. C#, C++, Rust, Ruby, Java are neither fish nor fowl and will not make it. Python has become too useful not to survive. I started with Cobol and would happily dance on its grave before I pass.
Come on, Java is so insanely big that it will simply get there from momentum alone.
Many say it is the new Fortran/Cobol, especially in finance, but it has something special — the JVM. Plenty of old software continues to run on virtualized hardware simply because it depends on a given CPU architecture’s quirks and can’t be ported. The JVM is well-specified, and thus any program written against it can run indefinitely, independent of hardware. Being cross-platform is also “vertical” in that past and future architectures can be supported too.
And on top of that, tell me any other platform with a specification of both language and runtime (where even data races are not completely UB) that has as many independent implementations, many of which are supported by FAANG companies and could each be developed further if anything were to happen to the others. Like, Alibaba alone could continue to develop the platform.
If by C you mean descendants of C, I agree with you. If you mean C itself will survive, I strongly disagree.
I think with the recent focus on safety in languages, Rust and Ada, especially SPARK, will have better futures than C and C++. I think business is going to turn against C[++] for new projects, and there will be a push to rewrite C[++] code in safe languages, especially for operating systems, networking, and security code, making C[++] the next COBOL.
Since safety isn't something you can simply bolt onto a language, many other safe languages will crop up to replace the rest of the unsafe languages.
If we try to predict the future instead of reviewing the past, I speculate that (complex) programming languages will not be necessary, since a large amount of the code written in the world is redundant. So minimum declarative stuff will be enough to create the amount of software now built by lots of engineers.
On the other hand, it is interesting to discuss what will happen with low-level programming such as writing drivers, operating systems, and browsers. Some of these could be generated from a spec. I remember HN posts such as "A full TCP/IP stack in under 200 LoC (and the power of DSLs)" [1]; that sounds like a toy now, but I hope to see this approach in the near future, used to build complete stacks where we just fill in the spaces between.
[1] https://news.ycombinator.com/item?id=846028
> programming languages will not be necessary, since a large amount of the code written in the world is redundant. So minimum declarative stuff will be enough to create the amount of software now built by lots of engineers.
I am not hopeful about that. As a counterexample, look at parsing. We have studied this for over 50 years. We have really good theory. We even have a good way to do specifications (EBNF). And we even have parser generators such as yacc.
Yet despite this, for reasons of flexibility, performance, and/or good errors, pretty much every production compiler uses a hand-written recursive descent parser.
Most problems in computing don’t have nearly the formal theory and study that parsing does. If we can’t make parsing work in the real world, I am not hopeful about the other stuff.
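For readers who haven't seen one: a hand-written recursive descent parser is literally one function per grammar rule, which is exactly why it's so easy to bolt flexibility and good error messages onto. A minimal sketch of the idea (my own toy example for a tiny arithmetic grammar, not code from any real compiler):

```
#include <ctype.h>
#include <stdio.h>

/* Grammar (EBNF-ish):
     expr   := term   (('+' | '-') term)*
     term   := factor (('*' | '/') factor)*
     factor := NUMBER | '(' expr ')'
   One function per nonterminal. */

static const char *p;                 /* cursor into the input */

static double expr(void);

static void skip_ws(void) { while (isspace((unsigned char)*p)) p++; }

static double factor(void) {
    skip_ws();
    if (*p == '(') {
        p++;                          /* consume '(' */
        double v = expr();
        skip_ws();
        if (*p == ')') p++;           /* consume ')' */
        return v;
    }
    double v = 0;
    while (isdigit((unsigned char)*p))
        v = v * 10 + (*p++ - '0');
    return v;
}

static double term(void) {
    double v = factor();
    for (;;) {
        skip_ws();
        if (*p == '*')      { p++; v *= factor(); }
        else if (*p == '/') { p++; v /= factor(); }
        else return v;
    }
}

static double expr(void) {
    double v = term();
    for (;;) {
        skip_ws();
        if (*p == '+')      { p++; v += term(); }
        else if (*p == '-') { p++; v -= term(); }
        else return v;
    }
}

int main(void) {
    p = "2 * (3 + 4) - 5";
    printf("%g\n", expr());           /* prints 9 */
    return 0;
}
```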
I mean, as a field we love to reinvent the wheel over and over again. Even in the same language, if something's not in the stdlib (or even if it is), it's not uncommon to find multiple versions of the same thing. Take, for example, SQL database drivers in Python: I have worked with at least 4 of them at various companies. They all do virtually the same thing, with some minor differences.
So I don't think we will ever land in a situation where everyone uses the same thing.
> So minimum declarative stuff will be enough to create the amount of software now built by lots of engineers.
That idea is the nuclear fusion of software development: it's always just 15 years away, even 50 years ago ;)
Do you believe it would happen?
(1) I didn't know that COBOL was Grace Hopper’s baby! But I've spent decades trying to avoid learning too much about COBOL lest I accidentally back into having to support a COBOL system.
(2) No mention of PL/I in the vaguely Algol-like group. I liked PL/I!
(3) I'd never heard of Laravel, and was pretty okay with that.
(4) "One of C’s old promises was to act like a PDP-11 computer." I wasn't aware of this assertion. I'll have to think about it. Do the increment / decrement operations mirror the PDP-11's addressing modes? I guess maybe they do!
https://en.wikipedia.org/wiki/PDP11_architecture#Addressing_...
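For what it's worth, the snippet people usually point to in this discussion is pointer post-increment, which maps naturally onto the PDP-11's autoincrement addressing mode - though Ritchie himself noted that ++ and -- predate the PDP-11 (they appeared in B), so the "designed for the PDP-11" story is at least partly folklore. A minimal illustration (my own, with a hypothetical helper name):

```
/* The classic idiom: copy a string with post-increment on both pointers.
   On the PDP-11 the corresponding loads/stores with autoincrement were
   single instructions, which is why this pairing gets cited so often. */
void copy_str(char *dst, const char *src) {
    while ((*dst++ = *src++) != '\0')
        ;   /* empty body: all the work happens in the condition */
}
```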
Actually, it isn't, in spite of commonly being referred to as such. From Wikipedia:
* "Designed by Howard Bromberg, Norman Discount, Vernon Reeves, Jean E. Sammet, William Selden, Gertrude Tierney, with indirect influence from Grace Hopper"
* "[Grace Hopper] did not participate in its work except through the general guidance she gave to her staff who were direct committee members. Thus, while her indirect influence was very important, regrettably the frequent repeated statements that "Grace Hopper developed Cobol" or "Grace Hopper was a codeveloper of Cobol" or "Grace Hopper is the mother of Cobol" are just not correct."
I think it is interesting comparing computer languages to human languages. Reading Charles Dickens - over 150 years old now - is fascinating. Large parts of it are completely intelligible, and then all of a sudden there's a phrase or bit of slang which just doesn't make any sense.
Even listening to interviews with pop groups or politicians from the 1960s can produce a weird kind of cognitive dissonance - you know they're speaking the same language as you, but discussing long-dead concepts with outmoded words.
I wonder if computer science will ever have the equivalent of Shakespearean scholars, trying to decipher the meaning of esoteric comments?
> I wonder if computer science will ever have the equivalent of Shakespearean scholars, trying to decipher the meaning of esoteric comments
They are already there. Usually you call them colleagues. :D
I'm not sure the article is right about ALGOL being dead. I've seen renewed interest in it over the last couple of years, and I believe I even saw a new ALGOL compiler - but I can't remember if it was for -68 or -60.
ALGOL, its Wirthian descendants (Pascal, Modula-2, Oberon) and Ada were designed to be safe languages. Partly due to Pascal's weaknesses in systems programming and partly due to the low speed of computers at the time, C took over, which was a shame because it is one of the least safe languages. But networking hadn't become common yet, and people wanted to save every processor cycle possible and didn't want to waste any on safety. I think this resurgence of focus on safety could bring some life back to this line of ALGOL descendants. Of course Ada is still used worldwide in high-reliability systems such as aircraft systems, air traffic control systems, and spacecraft. There is Ada code in NASA's Artemis project.
I liked the following blog post on the topic (coincidentally... it reps one of my favourite languages, hehe):
"IM READY: LET THE 100 YEAR PROGRAMS BEGIN: Exploring Standard ML's robustness to time and interoperability"
http://len.falken.directory/p-lang/100-year-programs.txt
> Fortran is significantly faster than C, for instance.
That's news to me. Languages that are close enough to the hardware and allow for inline assembly can probably never be described as slow overall. Perhaps the author refers to some specific libraries?
> Java is the most recent popular general-purpose language
This post was written in 2022. Does the author not know about Python? Javascript? Rust? C#? and a bunch of others?
Weird.
> Fortran is significantly faster than C, for instance.
AFAIK the main reason is pointer aliasing, e.g. you may need to sprinkle C code with the restrict keyword to work around the issue. "Significantly faster" is debatable of course; I bet it's possible in both Fortran and C to write code that performs equally badly.
So you're saying that the aliasing defaults of C are an issue. Well... I suppose, but - not `restrict`ing your pointers in the hottest part of the code is a bit like not compiling with optimizations on.
>> Java is the most recent popular general-purpose language
> This post was written in 2022. Does the author not know about Python? Javascript? Rust? C#? and a bunch of others?
I had to double check this - but Python is several years older than Java. Wikipedia lists 1991 for its first release, vs 1995 for Java.
That said, I felt like Python became really well-known much later than Java (which had massive hype and enthusiasm in the 90s) - so I do actually agree with you listing it here.
> > Java is the most recent popular general-purpose language
> This post was written in 2022. Does the author not know about Python? Javascript? Rust? C#? and a bunch of others?
Maybe it's phrased weirdly, but Java is hugely popular and widely deployed, it's hard to argue against that.
As one data point, here is comparing Java, Python, JavaScript, C# and Rust on Google Trends:
https://trends.google.com/trends/explore?cat=31&date=all&q=J...
Unsurprisingly, Java is still the most searched term in the "Programming" category. JavaScript is catching up, but still has some way to go. Rust barely registers.
Worth keeping in mind is that what's popular in startup circles (which I'm guessing a lot of HN users come from) isn't what's popular in the other 90% of businesses.
Rust isn't popular, C# isn't general-purpose and the other two are older than Java.
> Rust isn't popular
Fair enough - it's more fashionable than popular :-) It did close TIOBE's top 20 for 2022: https://www.tiobe.com/tiobe-index/ (and that includes non-general-purpose languages)
... but you're right in that I could have probably chosen Go or TypeScript. The point is that Java is not remotely the latest popular general-purpose programming language.
> C# isn't general-purpose
It is, see definition: https://en.wikipedia.org/wiki/General-purpose_programming_la...
> and the other two are older than Java.
Javascript was named after Java... see: https://en.wikipedia.org/wiki/JavaScript
As for Python - you're technically right, but it only became popular after Java already was.
I am curious, why do you think C# is less general-purpose than Java?
As a user, I care more about portability between those programming languages. It's the best of both worlds.
Nowadays, I tend to focus on polyglot programming, where I can pick the best tool for the job; it's the true Single Responsibility Principle that works for me.
I also love language wars, because from them I can learn the tradeoffs of each language more clearly.
Human language doesn’t stay the same over 100 years. Is it reasonable to expect that from a computer language?
To be fair, they are different. But I think they're close enough for the analogy to stand.
Different idioms and ways of expression arise constantly, coming into and out of vogue. Human languages aren’t rigorously defined like computer languages must be. Human languages are loosely specified by the amorphous collection of “all speakers” (whatever that means), and they evolve along those lines, with each generation reshaping the language. Computer languages also change, but they tend to hold onto their baggage much more. E.g. C is drowning in all the unchangeable legacy from the 70s and 80s.
Is a 100 yr programming language possible? Sure. But it’s gonna feel a lot like you’re a scholar in 2023 writing a modern academic technical paper in Latin or Early Modern English. Doable, yes. But comical.
In fact, this is arguably already the state of C. Very useful programs are still written in it. And it feels like Early Modern English. Does that matter? Idk. But as the current trend in programming goes, most young engineers won’t be happy with this dynamic.
Personally, I suspect we get it just because we have so many damn programs in the world. But I can’t imagine anyone will count it as a success.
It's a very interesting article, but the lack of citations backing up some of the big claims makes me doubt its premises. It's a shame because the author seems to know his programming language history. The most important one, IMO, is this point:
> Fortran is significantly faster than C
Then later:
> Is Performance Necessary for a Hundred-Year Language?
> Fortran, one of the oldest thriving languages, lives and dies on performance. So that’s a check mark in the “yes” column.
I did some googling and found that there are some design choices that might make Fortran faster, but it's highly debatable whether it's "significantly faster than C".
Given how hardware has advanced over the years, "performance" just seems a weird feature to look for in a 100 year language too. You don't build performant software by writing it in Fortran or C; you do it by making sensible design choices, writing thoughtful code, and not over-architecting or over-complicating provisioning.
IMO C is obviously going to be a 100 year programming language, and I came away a bit disappointed the article didn't quite fully claim that.
An interesting and thought provoking read nonetheless.
In the past, Fortran was always faster than C in scientific computation applications, mainly because many optimizations are prevented when compiling C by the risk of pointer aliasing.
In modern C, such optimizations are possible by using "restrict".
Nevertheless, few C programmers bother to write "restrict" wherever it should be used, so casually written C code remains slower than similarly written Fortran code.
Moreover, modern Fortran has a lot of builtin operations for multidimensional arrays, which can be implemented in an optimized way by the compiler.
Writing simple C code cannot achieve a comparable effect.
Achieving similar performance in C would require a special optimized library, and that would result in extremely verbose and ugly C code. It is really impractical to do this in C; C++ must be used instead, with operator overloading, iterators and templates needed to write a custom library that can match or exceed the builtin features of Fortran.
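To make the aliasing point concrete, here's a minimal sketch (my own illustration, not from the thread). The two functions compute the same thing, but only the second gives the compiler roughly the no-overlap guarantee that Fortran compilers are allowed to assume for dummy array arguments:

```
#include <stddef.h>

/* Without restrict, the compiler must assume out, a and b might overlap,
   which blocks vectorization of this loop in the general case. */
void axpy_plain(float *out, const float *a, const float *b, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = 2.0f * a[i] + b[i];
}

/* With restrict (C99), the programmer promises the arrays don't overlap,
   and the compiler is free to vectorize. */
void axpy_restrict(float *restrict out, const float *restrict a,
                   const float *restrict b, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = 2.0f * a[i] + b[i];
}
```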
> Writing simple C code cannot achieve a comparable effect.
Can you provide a citation or source on this please?
I would disagree that C is going to be a 100 year programming language. I think the recent focus on safety is going to not only turn businesses away from using C, but push them to actively remove C from their code bases in the quest for safety. C was fine back before networking was everywhere. Today, it's a disaster waiting to happen in every C program. Sure, there are C programmers who claim they can write safe C code, but the numerous hacks prove they can't. Why wouldn't a programmer prefer a language that handles most of the safety for you, and is designed to allow for better optimization to boot?
I don't think I'd claim anything about what computers are "obviously" going to be in 50 years' time. AI, quantum computing, more advanced BMIs (thought-controlled computers etc.) and who-knows-what could disrupt everything we know in that time easily. As could civilisational collapse, though I'm certainly willing to bet there'll still be computers and programming languages of some sort in another 50 years.