Comment by krapp

1 year ago

>Makes Brendan Eich look like a total clown in comparison.

To be fair, Brendan Eich was making a scripting language for the '90s web. It isn't his fault Silicon Valley decided that language needed to become the Ur-language to replace all application development in the future.

Most of the blame should go to Netscape management. They didn't give Eich much time, then burst in before he was done and made him copy a bunch of things from Java. (The new language, codenamed "Mocha" internally, was first publicly announced as "LiveScript", and then Sun threw a bunch of money at Netscape.)

IIRC, Eich was quite influenced by Python's design. I wish he'd just used Lua - it would likely have saved a lot of pain. (Although, all that said, I have no idea what Lua looked like in 1994, or how much of its design has changed since then.)

  • https://news.ycombinator.com/item?id=1905155

    If you don't know what Lua was like then, don't wish that I'd "just used Lua".

    Other issues include Netscape target system support, "make it look like Java" orders from above, without which it wouldn't have happened, and more.

    • Oh hi Yoz! LTNS! Hi Brendan!

      It sounds like you're saying Yoz got the sequence of events wrong, and that MILLJ was a necessary part of getting scripting in the browser? I sort of had the impression that the reason they hired you in the first place was that they wanted scripting in the browser, but I wasn't there.

      I don't think Lua was designed to enforce a security boundary between the user and the programmer, which was a pretty unusual requirement, and very tricky to retrofit. However, contrary to what you say in that comment, I don't think Lua's target system support or evolutionary path would have been a problem. The Lua runtime wasn't (and isn't) OS-dependent, and it didn't evolve rapidly.

      But finding that out would have taken time, and time was extremely precious right then. Also, Lua wasn't open-source yet. (See https://compilers.iecc.com/comparch/article/94-07-051.) And it didn't look like Java. So Lua had two fatal flaws, even apart from the opportunity cost of digging into it to see if it was suitable. Three if you count the security role thing.


> It isn't his fault Silicon Valley decided that language needed to become the Ur-language to replace all application development in the future.

Which remains one of the most baffling decisions of all time, even to this day. Javascript is unpleasant to work with in the browser, the place it was designed for. It is utterly beyond me why anyone would go out of their way to use it in contexts where there are countless better languages available for the job. At least in the browser you pretty much have to use JS, so there's a good reason to tolerate it. Not so outside of the browser.

> To be fair, Brendan Eich was making a scripting language for the 90's web.

He was, and he doesn't deserve the full blame for being bad at designing a language when that wasn't his prior job or field of specialization.

But Lua is older so there's this element of "it didn't need to be this bad, he just fucked up" (And Eich being a jerk makes it amusing to pour some salt on that wound. Everyone understands it's not entirely serious.)

"Silicon Valley" is not an actor (human or organization of humans) that decided any such thing. This is like saying a virus decides to infect a host. JS got on first, and that meant it stuck. After getting on first, switching costs and sunk costs (fallacy or not) kept it going.

The pressure to evolve JS in a more fair-play standards setting rose and fell as browser competition rose and fell, because browser vendors compete for developers as lead users and promoters. Before competition came back, a leading or upstart browser could and did innovate ahead of the last JS standard. IE did this with DHTML mostly outside the core language, which MS helped standardize at the same time. I did it in Mozilla's engine in the late '90s, implementing things that made it into ES3, ES5, and ES6 (Array extras, getters and setters, and more).
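For anyone who hasn't run into those features by name, here's a quick sketch of two of them as they later landed in ES5 (the temperature object is just illustrative):

```javascript
// "Array extras": higher-order iteration methods (map, filter, forEach,
// reduce, ...), standardized in ES5.
const doubled = [1, 2, 3].map(function (x) { return x * 2; }); // [2, 4, 6]

// Getters and setters, also standardized in ES5 object literal syntax.
const temp = {
  celsius: 0,
  get fahrenheit() { return this.celsius * 9 / 5 + 32; },
  set fahrenheit(f) { this.celsius = (f - 32) * 5 / 9; },
};
temp.fahrenheit = 212; // runs the setter
console.log(temp.celsius); // 100
```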

But the evolutionary regime everyone operated in didn't "decide" anything. There was and is no "Silicon Valley" entity calling such shots.

  • > "Silicon Valley" is not an actor (human or organization of humans) that decided any such thing.

    Oh come on, you understand full well that they're referring to the wider SV business/software development "ecosystem".

    Which is absolutely to blame for javascript becoming the default language for full-stack development, and the resulting JS-ecosystem being a dysfunctional shitshow.

    Most of this new JS-ecosystem was built by venture capital startups & tech giants obsessed with deploying quickly, with near-total disregard for actually building something robustly functional and sustainable.

    e.g. React as a framework does not make sense in the real world. It is simply too slow on the median device.

    It does, however, make sense in the world of the Venture Capital startup. Where you don't need users to be able to actually use your app/website well. You only need that app/website to exist ASAP so you can collect the next round of investment.

    • Oh come on yourself.

Companies including Bloomberg and Microsoft (neither in nor part of Silicon Valley), along with big-to-small companies all over the world, built on JS once Moore's Law and browser tech combined to make Oddpost, and then Gmail, feasible.

      While the Web 2.0 foundations were being laid by indie devs, Yahoo!, Google, others in and out of the valley, most valley bigcos were building “RIAs” on Java, then Flash. JS did not get some valley-wide endorsement early or all at once.

While there was no command-economy leader or bureaucracy to dictate "JS got on first but it is better to replace it with [VBScript, likeliest candidate]", Microsoft did try a two-step approach after reacting to JS and reverse-engineering it as "JScript".

They also created VBScript alongside JScript and worked to promote it too (its most-used sites were MS sites), but JS got on first, so MS was too late even by the IE3 era, and IE3 was neither competitive vs. Netscape nor tied to Windows. IE4 was better than Netscape 3 or the tardy, buggy Netscape 4 on Windows; and most important, it was tied. For this tying, MS was convicted in _U.S. v. Microsoft_ of abusing its OS monopoly.

      Think of JS as evolution in action. A 2024-era fable about the Silly Valley cartel picking JS early or coherently may make you feel good, but it’s false.

I don't think JavaScript will replace all application development in the future. WebAssembly will displace JavaScript. With WebAssembly you can use whatever language you like and achieve higher performance than JavaScript.
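As a minimal sketch of that model (runnable in Node or a browser console): the byte array below hand-encodes a tiny wasm module exporting an `add` function. In practice you'd compile those bytes from C, Rust, Go, etc., but the loading side in JS is the same either way.

```javascript
// A hand-encoded WebAssembly module (normally produced by a compiler)
// that exports one function: add(a, b) -> a + b on 32-bit ints.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm" magic + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export section: "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section: one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0; local.get 1; i32.add; end
]);

// Compile and instantiate synchronously (fine for tiny modules; real
// apps use the async WebAssembly.instantiate / instantiateStreaming).
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 3)); // 5
```

The "higher performance" part is more nuanced in practice, since JS engines JIT aggressively, but the key point stands: the browser no longer cares what language produced the bytes.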