
Comment by snickerer

1 day ago

Allowing scripting on websites (in the mid-90s) was a completely wrong decision. And an outrage. Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust. That’s completely unacceptable; it’s fundamentally flawed. Of course, you disable scripts on websites. But there are sites that are so broken that they no longer work properly, since the developers are apparently so confused that they assume people only view their pages with JavaScript enabled.

It would have been so much better if we had simply decided back in the ’90s that executable programs and HTML don’t belong together. The world would be so much better today.

> Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust

Would've been cool if we could know whether site X served the same JS as before. Like a system (maybe even decentralized) where people could upload hashes of a site's JS files. Someone could even review them and post their opinions. But mainly you'd know you're getting the same JS as before - that the site hasn't been hacked and that you're not being targeted personally. If a file needs to update, the site could note it in a changelog, something like "updated the JS file used for collapsing comments to fix a bug", and users could push that to the system.

Especially important for banking sites and webmail.

Stepping back, it's pretty ridiculous that I need to download executable code, often bloated, solely to view read-only content. Just render the thing on the backend and send it to the client.

  • But the web-dev-hype people told me that JS-heavy SPAs (and associated designs) were faster and better for the user!

    I didn’t bother validating this, but I’m sure they wouldn’t lie or misinterpret!!

    • For me personally the most infuriating example of this is the Azure Updates[1] page, which in my job I need to check nearly daily to see what's reaching EoL, what's new, etc...

      A couple of years ago they redeveloped it as an SPA.

      The original server-rendered version of it worked just fine, but it "had" to be made into an interactive client-side monstrosity that loads many times slower for "reasons".

      It doesn't even load successfully about a quarter of the time. It shows items in reverse order (entries from 2013 first), which is some sort of async loading bug. They will never fix this. It's been there for two years already, it'll be there for a decade more, mark my words.

      Then, it takes about a minute to load sometimes on a poor connection.

      The links are JavaScript and don't allow "open in new tab".

      Etc...

      All of this to enable client-side filtering, which is a non-feature nobody ever wanted. A simple server-side filter capability would do the same thing, faster.

      And anyway, the filtering is broken! If I click the "New or updated" filter, it drops down an empty selection with no options. Clicking anything else doesn't change what is shown!

      While developing this over-engineered monstrosity, they took the original site offline for "maintenance"!

      Hilariously, despite Azure having multiple CDN products, the Azure Updates page doesn't correctly use their own CDN and marks almost everything as "no-cache, no-store", causing 2.5 MB (after compression) to be re-transferred every time, despite using unique signed URLs with SHA256 hashes in them!

      This is the state of web-dev in the 2020s: a multi-trillion-dollar software company can't hire developers who know anything other than SPA web app development!

      This commonly used page has spectacularly poor web engineering, and this is from a company that sells a web app platform, a CDN, and the ASP.NET web app development framework!

      If they can't get it right, who can!?

      [1] https://azure.microsoft.com/en-au/updates/

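The caching complaint has a standard fix: a URL that embeds a content hash can never serve different bytes under the same name, so it is the ideal candidate for aggressive caching. A hedged sketch of the headers involved (values illustrative, not taken from the actual site):

```http
# What the page reportedly sends for a hash-addressed asset,
# forcing a full re-download on every visit:
Cache-Control: no-cache, no-store

# What a hash-addressed asset could safely send instead; if the file
# changes, its hash (and therefore its URL) changes, so "immutable" is safe:
Cache-Control: public, max-age=31536000, immutable
```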

> Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust.

JavaScript and WebAssembly programs are always executed in a sandboxed VM, without read access to the host OS files (unless, of course, you grant it).

Enabling scripting was a necessary step for interactive websites. Without it, a full page load would be required every time you upvote a Hacker News comment. In my opinion, the real problem is that browsers allow too many connections to third-party domains, which are mostly ads and trackers. Those should require user-approved permissions instead of being the default.

Why can't MY browser send some random JS to THEIR website? If it's safe for me to run some stranger's code, shouldn't it be safe for strangers to run mine?

Disable not just JavaScript, but also CSS. I'm not kidding. Many websites actually contain all the content in the HTML but use CSS to hide it and then JavaScript to show it again.

If scripting wasn't allowed, we'd probably all be using a different browser that allowed it - probably one built around Flash.

There is obviously huge demand for scripting on websites. There is no single authority on what gets allowed on the web; if the existing orgs hadn't implemented it, someone else would have, and users would have moved over once they saw they could access new, more capable, interactive pages.

The 49MB webpage just shows what our priorities are. It shows the target audience has fast internet that can load this without issues. On my average home connection in Australia, I can download a 49MB page in 0.3 seconds. We spend time optimising for what matters to the end user.

  • Is the page actually done downloading 0.3 seconds after it starts? Or is it just 49 MB / (your Internet speed) = 0.3 seconds?