Comment by zentiggr
5 years ago
I think I've had my own moment of clarity that spans both Forth(s) and Lisp(s) and explains why neither is as common as other languages.
In most common languages, there is a complicated base spec that covers many cases and defines a broad range of affordances, plus libraries upon libraries that expand on an already fleshed-out collection of tools.
Forths and Lisps give you the core of an environment, and let/expect you to build on the foundation to create your own implementation. Like someone else in this thread said, N programmers, N dialects. Or, more accurately, every Forth program is its own DSL for accomplishing its work.
I think you can draw this ‘core of an environment’ parallel between Forth and Scheme: both are small languages and they emphasize growing the language to the problem domain [1]. Common Lisp, on the other hand, is a large language: implementations provide much more than a foundational core, and a fairly comprehensive list of libraries exists. I think RPG’s Worse Is Better highlights some of the reasons why CL isn’t as popular as other languages [2].
[1] https://youtube.com/watch?v=_ahvzDzKdB0
[2] https://www.dreamsongs.com/WIB.html
Off topic, but this (from RPG's Worse is Better) sounds very familiar:
> Part of the problem stems from our very dear friends in the artificial intelligence (AI) business. AI has a number of good approaches to formalizing human knowledge and problem solving behavior. However, AI does not provide a panacea in any area of its applicability. Some early promoters of AI to the commercial world raised expectation levels too high. These expectations had to do with the effectiveness and deliverability of expert-system-based applications.
You are giving technical merits way too much credit.
People will put up with whatever bullshit as long as there is demand and it helps them get a job.
The thing is, UNIX was a massive success, and it happened to be written in C. Since then, every successful language has had to have a syntax familiar from the host language.
It was UNIX that killed the Lisp Machine (by being given away for free). Programming languages never got to play a role.
Unix wasn't given away for free; it was encumbered by AT&T licensing, and you needed hardware that certainly wasn't free and still wasn't affordable to individual consumers. But Unix was a resource-efficient system that scaled down to cheaper hardware with less RAM.
Even the dyed-in-the-wool Lisp enthusiasts headed by Richard Stallman were compelled to reproduce Unix, even though their stated goal was to have a system running Lisp.
Stallman decided to reimplement Unix because it was popular, not because it was somehow better suited to the hardware.
One of the first GNU programs he released was indeed his "system running Lisp"; it was called Emacs.
It was given to universities for free, IIRC.
Lisp peaked very early, and grew at a voracious pace compared to progress in hardware. The result was that it required "big iron". Lisp was a victim of the same process that killed the mainframes: a rebooting of the computer industry with cheap, but (initially) under-powered microcomputers, to which legacy systems were not able to migrate.
There also arose a new generation of hackers brought up on the new microcomputers who didn't care for, know, or even have access to legacy systems. As microcomputers showed signs of advancement, old hackers who had learned to make things fit into small memories 15 years prior brandished their skills, which popularized tools like Pascal and C. Turbo Pascal for MS-DOS PCs fit a compiler and IDE into under forty kilobytes.
In the 1980s, people who wanted to use their Lisp techniques to deploy into the microcomputer market were faced with rewrites. A blatant example of this is CLIPS: an expert system tool written in C which retains the Lisp-like syntax of its predecessor. https://en.wikipedia.org/wiki/CLIPS . CLIPS was inspired by a Lisp-based system called OPS5. But OPS5 itself had also been rewritten in BLISS for speed: https://en.wikipedia.org/wiki/OPS5 .
http://staff.um.edu.mt/jskl1/turbo.html#Lisp
How about both ... message from the past for the future
Why not say "Every ~~Forth~~ program is its own DSL for accomplishing its work"? For moderately complicated programs in any language you choose, it can take a long time to grok how the literal code relates to solving the conceptual problem. No language can build in every abstraction, and no programmer has time to learn them all.
I think you are close to part of an answer, but it isn't because Forth and Lisp expect one to do more work than other languages. If anything, they expect one to do less. The problem is programmers feel lost because there is no way to differentiate the bedrock of the language from higher abstractions. C has operators and statements and keywords that tell you there is nothing "underneath" what you are looking at. With Forth, everything is words. With Lisp, everything is lists.
To be fair, it is very common for Forth programmers to redefine the interpreter as they go. You literally change the language in your program. That's a very different expectation for other kinds of languages.
> You literally change the language in your program. That's a very different expectation for other kinds of languages.
I wonder about that. A few weeks back I read about a coroutine implementation in C, using plain C and lots of intricate preprocessor definitions:
https://www.chiark.greenend.org.uk/~sgtatham/coroutines.html
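To make the trick concrete, here is a minimal sketch of the switch-based coroutine technique that article describes. These are not Tatham's actual macro definitions; the CR_* names and the next_even generator are made up for illustration. The idea is that the coroutine's resume point is just the line number of the last yield, and a switch on that value jumps back there on the next call:

    #include <stdio.h>

    /* Sketch of the switch/__LINE__ coroutine trick (illustrative names,
       not the article's exact macros). The saved "state" is the line
       number of the last yield; re-entry switches on it to resume. */
    #define CR_BEGIN(state)    switch (state) { case 0:
    #define CR_YIELD(state, x) do { state = __LINE__; return (x); \
                                    case __LINE__:; } while (0)
    #define CR_END             }

    /* A generator that yields 0, 2, 4, 6, ... across successive calls. */
    static int next_even(void)
    {
        static int state = 0;   /* 0 means "start from the top" */
        static int n = 0;
        CR_BEGIN(state);
        for (;;) {
            CR_YIELD(state, n); /* return n; resume here on the next call */
            n += 2;
        }
        CR_END;
        return -1;              /* never reached */
    }

    int main(void)
    {
        for (int i = 0; i < 5; i++)
            printf("%d ", next_even());  /* prints: 0 2 4 6 8 */
        putchar('\n');
        return 0;
    }

The case label hidden inside CR_YIELD lands in the middle of the loop body (the same contortion Duff's device relies on), which is exactly the sense in which the preprocessor is being used to bolt a new control-flow construct onto the language.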
There are similar examples in just about any language out there. People use whatever tools the language ecosystem provides to change the language to fit some problems better. Some languages are easier to change and extend, some are harder, but that doesn't stop people from trying to do this anyway.
I think there's a level of familiarity with the language above which changing it is a natural thing to do. It can take years before you learn a "normal" language well enough to be able to do this, but with Forth, Scheme, Prolog, and the like, you're basically required to do this from the get-go. My intuition is that these languages simply target advanced, already experienced programmers, while completely ignoring beginners. So it's more of an optimization for a different user base, IMO. That would also explain how these languages are still alive despite their communities being very small for the last 50 years.
> In most common languages, there is a complicated base spec that covers many cases and defines a broad range of affordances
Sure, if we ignore C.