Comment by prewett
11 days ago
A cogent article. But I think the biggest problem is that the DOM was built for documents, not apps. We know how to build a performant UI architecture: Qt, Java/Swing, Cocoa all have pretty similar architectures and they all ran fine on much poorer hardware than a modern browser on an M1. But unless you use WebAssembly, you can't actually use them on the browser.
When the industry shoehorns something into a tool designed for something else, yeah, performance suffers and you get a lot of framework churn with people trying to figure out how to elegantly cut steaks with spoons.
But most apps are documents: they are built to render data and text fields in a nice way for the consumer to use.
You most certainly shouldn't be building graphs with table elements, but JS has canvas and SVG, which make vectors pretty efficient to render.
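To make that concrete, here's a hypothetical sketch of the SVG approach: instead of abusing table cells, a line chart can be emitted as a single `polyline`. The function name, data, and dimensions are made up for illustration, not from any library.

```javascript
// Sketch: render a data series as an SVG polyline instead of table cells.
// Everything here (names, sizes) is illustrative.
function lineChartSvg(points, width = 200, height = 100) {
  const maxY = Math.max(...points);
  // Scale each value into the viewport; SVG's y-axis grows downward.
  const coords = points
    .map((y, i) => {
      const x = (i / (points.length - 1)) * width;
      const sy = height - (y / maxY) * height;
      return `${x},${sy}`;
    })
    .join(" ");
  return `<svg viewBox="0 0 ${width} ${height}">` +
         `<polyline fill="none" stroke="black" points="${coords}"/></svg>`;
}
```

The whole chart is one vector element the browser can rasterize in a single pass, rather than hundreds of boxes the layout engine has to place.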
The document model provides good accessibility and the ability for things like SEO and GEO to exist.
If you are making a racing simulator, then using HTML in no way makes sense, but for the apps that most of us use, documents make sense.
It would be nice if browsers implemented a new interpreted, statically typed language with direct canvas/viewport rendering that was more efficient than JavaScript, but Chrome would need to adopt it, and then developers would need to actually build things with it. It seems like it would currently have to come from within the Chrome team directly; they are the only ones who could control something like this.
The irony is that CSS works fairly okay for the small number of UI elements and web games that are decidedly not documents. Or perhaps that's not so much irony as filling in the gaps.
Fairly okay is the key phrase.
Everything works fairly okay on modern hardware. I'm sure someone could build a 3D rendering engine using only table elements and CSS and it would run decently well.
There are hundreds of tools in the belt, people can use any of them to tighten down the screw, but it doesn't mean that they are the most efficient or best to use.
I would also say that a lot of web games are closer to documents than you think. A chess board could be seen as a document: it has tables and rows; the pieces are just shaped differently from the characters we write with.
Something like a racing sim, again, could be implemented in CSS, but someone who actually understands how to use canvas is going to have a more efficient way to represent it.
While I don't have performance bottleneck numbers for React, I don't think it's about JavaScript vs. WASM here.
I've seen/built some large Qt/QML applications with a lot of JavaScript, and they all performed much better than your average React web app. In fact, V8 and the other browser JavaScript engines have JIT compilers, while the QML engine didn't.
Comparing QtQuick/QML + JS to HTML + JS (both GPU-accelerated scenegraphs), you should get similar performance in both. But in reality that is rarely the case. I suspect it might be the whole document-oriented text layout and CSS rules, along with React using a virtual DOM and a lot of other dependencies to give us an abstraction layer.
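To make the virtual-DOM overhead concrete, here's a toy diff; this is not React's actual reconciler, and the node shape and names are invented. It shows the extra tree walk a framework does before any real DOM mutation happens:

```javascript
// Toy virtual-DOM diff: walk two trees and collect patch operations.
// Illustrates the extra layer on top of the DOM, not React's real algorithm.
function diff(oldNode, newNode, path = "root") {
  if (!oldNode) return [{ op: "create", path, node: newNode }];
  if (!newNode) return [{ op: "remove", path }];
  if (oldNode.tag !== newNode.tag) return [{ op: "replace", path, node: newNode }];
  const patches = [];
  if (oldNode.text !== newNode.text) {
    patches.push({ op: "setText", path, text: newNode.text });
  }
  // Recurse into children; every render pays this walk even if nothing changed.
  const len = Math.max(oldNode.children?.length ?? 0, newNode.children?.length ?? 0);
  for (let i = 0; i < len; i++) {
    patches.push(...diff(oldNode.children?.[i], newNode.children?.[i], `${path}/${i}`));
  }
  return patches;
}
```

The point isn't that diffing is slow in itself; it's that this work, plus component code producing the new tree, all runs before the DOM is ever touched.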
I'd love to know more about this from someone who has done in-depth profiling of the same or similar apps on something like QtQuick vs. React.
It's not about Javascript vs. WASM; it's the DOM. DOMless apps like Figma are much faster.
The slowness in a React app is not the DOM. Typical apps spend most of their time running component code long before any DOM mutations are committed.
If you look at DOM benchmarks, it's extremely fast. Slow web pages come from slow layers on top.
You can easily prove whether the DOM is a performance bottleneck. DOM performance means one of two things: lookup/access speed and render speed. Render speed is only a DOM concern with regard to the quantity of nodes and node layering, though, as the actual painting to the display is a GPU concern.
To test DOM access speed, simply compare processing speed during test automation of a large single-page app with all DOM references cached to variables versus the same application with no such caching. I have done this, and there is a performance difference, but that difference cannot be noticed until other areas of the application are very well optimized for performance.
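A rough simulation of that comparison, with a stub standing in for a real document (the counter just tallies lookups; in a browser, each uncached `querySelector` call would also pay selector-matching cost):

```javascript
// Simulate cached vs. uncached DOM lookups with a stub document.
// We count lookups to show the shape of the difference; a real page
// pays selector matching on every uncached call.
let lookups = 0;
const fakeDocument = {
  querySelector(sel) {
    lookups++;
    return { id: sel, textContent: "" };
  },
};

function uncachedUpdates(n) {
  for (let i = 0; i < n; i++) {
    // One lookup per update.
    fakeDocument.querySelector("#status").textContent = `tick ${i}`;
  }
}

function cachedUpdates(n) {
  const el = fakeDocument.querySelector("#status"); // one lookup, reused
  for (let i = 0; i < n; i++) {
    el.textContent = `tick ${i}`;
  }
}

lookups = 0;
uncachedUpdates(1000);
const uncachedLookups = lookups; // 1000

lookups = 0;
cachedUpdates(1000);
const cachedLookups = lookups;   // 1
```

As the comment says, a 1000x difference in lookups still may not be the thing you notice until the rest of the app is already fast.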
I have also tested the performance of node layering, and it's also not what most people think. To do this I used an application that was like a desktop UI with many windows that can be dragged around, each with its own internal content. I found things slowed down considerably when the page had over 10000 nodes displayed across a hundred or so windows. I found this was slower than equivalent desktop environments outside the browser, but not by much.
Most people seem to form unmeasured opinions of DOM performance that do not hold up under tests. Likewise they also fail to optimize when they have the opportunity to do so. In many cases there is active hostility against optimizations that challenge a favorite framework or code pattern.
> To test DOM access speed simply compare processing speed during test automation of a large single page app with all DOM references cached to variables versus the same application with no such caching. I have done this and there is a performance difference, but that performance difference cannot be noticed until other areas of the application are very well optimized for performance.
This was my instinct. Remember, the author is a hammer. His role is to find performance issues in the frontend. His role is not to find performance bottlenecks in the whole system.
> the biggest problem is that the DOM was built for documents, not apps
I don't see the difference. They're both text and graphics laid out in variable-sized nested containers.
And apps today make use of all the same fancy stuff documents do: fonts, vector icons, graphics, rounded corners, multilingual text including RTL, drop shadows, layers, transparency, and so forth.
Maybe you think they shouldn't. But they do. Of all the problems with apps in web pages, the DOM feels like the least of it.
> I think the biggest problem is that the DOM was built for documents, not apps.
The world wide web was invented in 1989, Javascript was released in 1995, and the term "web application" was coined in 1999. In other words: the web has been an application platform for most of its existence. It's wrong to say at this point that any part of it was primarily designed to serve documents, unless you completely ignore all of the design work that has happened for the past 25 years.
Now, whether it was designed well is another issue...
> Java/Swing
> performant UI architecture
Not sure if it’s a joke or something.
It is very possible to make lightning-fast React web UIs. The DOM sucks, but modern computers are insanely fast, and browsers insanely optimized. It is also very possible to make sluggish-feeling Qt or Swing applications; I've seen a number.
It mostly takes some thinking about immediate responsiveness, about how "negligibly short" operations can introduce non-negligible, noticeable delays. Anything not related to rendering should be made async, and even the rendering work should be made as fast as possible. That's to say nothing of avoiding reflows, repeated redraws, etc.
In short, sloppy GUI code feels sluggish, no matter what tools you use.
> but modern computers are insanely fast, and browsers, insanely optimized
I think these facts have been used as excuses to shit up the app layer with slow, poorly optimized JS code.
An example of a recent high-performance app is Figma, which blows normal JS-only apps out of the water. And it does so by using C++ via WASM/WebGPU for its more demanding parts, which is most of it.
I think we have to let go of the "just get more RAM" approach and start optimizing web app code like Figma does.
> I think we have to let go of the "just get more RAM" approach and start optimizing web app code like Figma does
I think you're underselling how much work went into making Figma that way. You're talking about a completely different realm of optimization that most companies won't spend money doing, and most web programmers won't know how to do.
As long as there's no incentive to do otherwise, companies will slap together whatever they can and ship as many features as possible.
Exactly my point: if we're not sloppy, we can achieve amazing performance.
For most apps, React is never going to be an issue when it comes to performance, unless you use it wrong or you use it for something non-standard, like rendering fractals. It makes sense to analyse performance if your website is meant to reach absolutely everyone, including old smartphones. There's the issue that a single-page app can be bloated from the start, but that also falls in the "using it wrong" category.
> modern computers are insanely fast, and browsers, insanely optimized.
You made the reverse point: if you need modern hardware to run a React app with the same performance as a Svelte/Vue/Solid app on low-end hardware, something is fundamentally wrong.
> Anything not related to rendering should be made async,
The thing with DOM interaction is that if you try to make it synchronous, it gets really fucking slow (reflow and friends). So you want it linearized for sanity reasons, but probably not synchronous.
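Layout thrashing is the classic case: interleaving reads of layout properties with writes forces a synchronous reflow on every iteration. A stub that counts forced layouts (reading `offsetHeight` while layout is dirty, as a real browser would) shows why batching reads before writes helps; the stub itself is invented for illustration:

```javascript
// Stub element that counts forced synchronous layouts: reading a layout
// property while layout is dirty forces a "reflow", as in a real browser.
function makeElement() {
  let dirty = false;
  let reflows = 0;
  return {
    get offsetHeight() {
      if (dirty) { reflows++; dirty = false; } // forced synchronous layout
      return 100;
    },
    set height(_v) { dirty = true; },          // write invalidates layout
    get reflowCount() { return reflows; },
  };
}

// Interleaved read/write: every read after a write forces a reflow.
function thrash(el, n) {
  for (let i = 0; i < n; i++) {
    el.height = i;             // write (dirties layout)
    const h = el.offsetHeight; // read (forces reflow)
  }
}

// Batched: read once up front, then do all the writes;
// layout only needs to be flushed once afterwards.
function batched(el, n) {
  const h = el.offsetHeight;   // single read before any writes
  for (let i = 0; i < n; i++) {
    el.height = i;             // writes only
  }
}
```

This is the "linearized, not synchronous" point: group your reads, then your writes, instead of forcing the engine to resolve layout at every step.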
Not everyone can afford the “modern computers” you are talking about.
This is the first sane comment in this entire comment section. So much nonsense in here, but I guess I should expect that when JavaScript is in the post title.