Comment by gnosis

15 years ago

"So, how come Ruby is more marketshare than, say, SML? Could it possibly be that most people just find it easier?"

Marketing. Hype. And the familiarity of Ruby's Algol-like syntax.

Recall that before Ruby (and Python) became popular, the scripting language world was ruled by Perl. And most programmers were pretty happy with that. Perl was a mature language. It had tons of libraries. And it generally did what people wanted.

Then, out of nowhere, started a litany of "Perl sucks!" screeches, mostly coming out of the mouths of Python and Ruby fans. By now they've repeated that mantra so often that it's become dogma throughout much of the rest of the programming world, even (and often especially) among people who've never written a line of Perl.

Ruby and Python took that anti-Perl hatred and rode it all the way to the bank.

Sure, Ruby also had Rails, which was yet more hype. PHBs started ordering their websites to be written in it despite not having the faintest clue what it was. Much the same happened with Java.

As for Lisp and SML: they suffer from being "just too weird".

It'll be interesting to see what happens with F# in the long run, as it suffers from much the same "weirdness" as SML (being based on OCaml, which itself was based on SML). Will having a megacorp behind it make enough of a difference?

"Then, out of nowhere, started a litany of 'Perl sucks!' screeches"

Nowhere? Haha, I was there, and I'll tell you how it was: a whole bunch of people who'd started writing Perl in 1994 for CGI scripts had cause to revisit their own work a year or 5 years later to maintain it or to add features and found it utterly incomprehensible. The experience burnt us so badly that we all went looking for languages that were less "write-only". Python and Ruby had been around for a while at that point, but they got a huge boost from people fleeing Perl, and most of those people are still using Python and Ruby today.

Perl was and still is a great language for what it was intended for: automating system administration tasks and reporting on them, the kind of thing you can do in 50 lines of code. But for 50,000-line applications worked on over 10 years by dozens of different people, it was never a good choice.

  • "a whole bunch of people who'd started writing Perl in 1994 for CGI scripts had cause to revisit their own work a year or 5 years later to maintain it or to add features and found it utterly incomprehensible."

    You can write garbage in any language. If these people found their own code incomprehensible, it's their own fault.

    "But for 50,000 line applications worked on over 10 years by dozens of different people, it was never a good choice."

    I've yet to be convinced that Ruby or Python are any better. In many ways they're even worse.

    • Yes, you can write anything in any language that's Turing complete, but that's not the point. Perl's mantra is "there's more than one way to do it", so you get code that does the same thing a million different ways. Python emphasizes having one obvious way to do it, and so Python code is inherently more maintainable.

      1 reply →
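To make the contrast concrete, here is a small Python sketch (illustrative only, not from the thread): the same task spelled three ways. Perl culture embraces that variety; Python's style guidance nudges everyone toward the last spelling, which is part of why different people's Python tends to look alike.

```python
# Illustrative: squares of the even numbers, written three equivalent ways.
nums = [1, 2, 3, 4, 5, 6]

# 1. Explicit loop with an accumulator
squares_loop = []
for n in nums:
    if n % 2 == 0:
        squares_loop.append(n * n)

# 2. map/filter with lambdas, a more functional spelling
squares_map = list(map(lambda n: n * n, filter(lambda n: n % 2 == 0, nums)))

# 3. The idiomatic list comprehension, the "one obvious way" in Python
squares_comp = [n * n for n in nums if n % 2 == 0]

assert squares_loop == squares_map == squares_comp
```

All three produce the same list; the maintainability argument is that a codebase converging on spelling 3 is easier to read than one mixing all three.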

"Out of nowhere"? One of the perennial topics of the mid to late 90s was the idea that Perl needed major fixes to remove common pitfalls (e.g. the object model, syntax warts (nested structures are just horrid), etc) and be more suitable for large projects.

Years passed.

Perl 6 was talked about.

Years passed.

Nothing happened. People gave up and moved on. The same thing has happened to PHP: years of broken things not being fixed eventually caused people to distrust the language's future and prevented it from moving into new niches.

Lisp is more complex, with far more redeeming features, but by now it shares the same failed-promise challenge. I've only been writing software for a couple of decades, so I missed the early history of Lisp, but for most of that time I've kept hearing about how great it is and how it'll soon be as productive as the inferior languages people actually use to get work done. That point has always been in the future, never the present. Clojure is the only potential exception I see, and, to be honest, a large part of that is due to the foundation the Java community provides.