In the world of programming languages, nothing's new; all of the good stuff was thought of by 1970. That's basically the extent of what I learned in college.
This is absolutely not true. A lot of great work has been done in the past decade on interpreters and on making them fast. A modern interpreter is essentially a compiler that runs the code immediately (most are direct-threaded bytecode interpreters), so all of that work counts.
It is strange that modern software engineering en masse took such a huge step backwards in the C++ era of the late '80s and early '90s: tools fell back to primitive levels and languages became much less forgiving. Only now are we finally returning to the state of the art of 20 years ago.
Of course, some people bucked the trend and used these somewhat neglected technologies despite a lack of public popularity. Paul Graham is one of them, and he ended up doing pretty well for himself. :)
I don't think that's 100% true. There are some genuinely new ideas emerging from new machine models: Erlang, for example, which is like the machine language of concurrency. And Haskell wasn't really anticipated in the '70s, nor was much of Fortress -- especially the methods it uses for parallelism.
Erlang isn't exactly new. Development started over 20 years ago and it has been open source for 10 years.
http://www.erlang.org/course/history.html
Some exceptions I can think of:
CLOS and the metaobject protocol - the idea of leaving the language open for the user to change by way of metaclasses
hygienic macros
first-class continuations
monads
functional reactive programming
The first few were developed in the '80s, while the last is quite recent.
All the paradigms were discovered by the 70's, and a language is only really the View/Controller for the paradigm. At least that's true as far as what's currently published.