Comment by yewenjie
4 years ago
Quick question - how does the memoized DOM compare with the no-virtual-dom-at-all approach of something like Svelte?
https://krausest.github.io/js-framework-benchmark/2020/table...
This lets you compare multiple frameworks. Comparing Svelte against a few React variations, what I saw was that Svelte was always fastest, but usually by a factor of less than 2 (each React variant was 1.x times slower than Svelte).
The Imba vs React numbers (from the article at https://www.freecodecamp.org/news/the-virtual-dom-is-slow-me...) show a 30-40x speed difference.
I have to echo somebee about that benchmark suite. It's fairly well known in the framework-building community that this is not a very good benchmark (just ask Boris Kaul of ivi, Ryan Carniato of Solid.js, Leon Sorokin of domvm, etc.).
Some "frameworks" achieve good numbers there by being utterly unusable in real life, others miss the spirit of the benchmark completely (e.g. submitting a non-keyed implementation as keyed thereby gaining an unfair/misleading advantage), and as somebee said, the benchmark itself largely measures repaint time (and does so in a less than ideal way, by using setTimeout macro queue as a proxy for repaint time measurement, in band, instead of instrumenting performance via CDP). It lacks rigor in many ways (the most blatant was that initially it considered keyed and non-keyed implementations on par, but there are other issues such as putting a lot of weight into DOM creation compared to e.g. array diff, or as somebee said, not measuring low repaint load diffs)
IMHO, it only has two things going for it: a) it has a lot of frameworks listed, and b) it does at least attempt to measure repaint times, unlike other benchmarks that only measure JS time (which has become somewhat irrelevant since V8 et al. now offload repaints out of the JS thread).
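For illustration, the in-band proxy pattern looks roughly like this (just a sketch of the general technique, not the benchmark's actual code):

    // Measure an update "including paint" by scheduling a rAF and then a
    // macro-task; the gap is taken as a proxy for layout + paint time.
    // In-band measurement like this competes with the very work it measures.
    function measureUpdate(update: () => void): Promise<number> {
      return new Promise(resolve => {
        const start = performance.now();
        update();                        // framework applies its DOM mutations
        requestAnimationFrame(() => {
          setTimeout(() => resolve(performance.now() - start), 0);
        });
      });
    }

Out-of-band instrumentation via CDP (tracing from outside the page) avoids that problem, at the cost of a more complex harness.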
Tbh, I think the js-framework-benchmark is flawed. It mostly tests the performance of the browser. I should write a whole blog post about this. Just as an example, all the table benchmarks use a table with a non-fixed width, which results in a full repaint AND layout of the whole table + page whenever a cell changes. If you change the table to a fixed width (as all real tables are), the relative difference between the frameworks increases by a factor of 5 or more.
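Roughly the change I mean, as a sketch (the exact width is arbitrary):

    // Give the benchmark table a fixed layout so changing one cell no longer
    // forces layout of the whole table + page.
    const table = document.querySelector('table') as HTMLTableElement;
    table.style.tableLayout = 'fixed';
    table.style.width = '800px';   // any fixed width; the value itself doesn't matter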
And when you benchmark the speed of creating 10,000 DOM elements in an instant, less than 5% of the time should really be spent inside the framework one is supposed to be testing.
I stand by my claim in the mentioned article that tiny changes to a larger DOM tree are a far better indicator of real-world performance than anything else. Here Imba is really orders of magnitude faster than React.
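A sketch of the kind of measurement I mean (plain DOM here; a framework benchmark would push the same change through its update path):

    // Build a large tree once, then time a single small mutation.
    const list = document.createElement('ul');
    for (let i = 0; i < 10_000; i++) {
      const li = document.createElement('li');
      li.textContent = `row ${i}`;
      list.appendChild(li);
    }
    document.body.appendChild(list);

    const t0 = performance.now();
    (list.children[5000] as HTMLElement).textContent = 'changed';  // the tiny change
    console.log(`scripting took ${performance.now() - t0}ms`);     // paint measured separately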
The last time I tested it, Imba was more than 10x faster than Svelte as well, but I'm not proficient enough in Svelte to claim that as a fact, and I have tremendous respect for Rich Harris and everything he's done with Svelte and other libraries.
Setting aside performance comparisons, how does Imba's approach compare to Svelte's from a design perspective? From your "Meet the Memoized DOM" article, I take it that Imba is basically converting declarative code into imperative code that mutates the DOM – at first glance, that sounds very similar to Svelte's compiler-driven approach.
Are the two strategies as similar as they sound, or am I misunderstanding something?
This is a cool project, somebee. Interested to explore more.
On benchmarking: I went through the same concerns and ended up building a little benchmarking tool for a simple reactive UI library I'm working on. It's not super user-friendly yet, but it does a good job of profiling tasks.
You can write custom benchmarks that clearly separate pre-setup work, rather than relying on ready-made benchmarks (a bit of a pain initially, but it helps a lot with fine-tuning at the unit level going forward).
It uses the Chrome DevTools Protocol (CDP) through Puppeteer and lets you analyze execution durations separately (Scripting, Layout, Paint, etc.). Plus, it saves raw JSON profiling data, so you can import and examine it visually in the DevTools Performance tab's timeline.
I think it will be helpful: https://github.com/dumijay/pfreak
This is what the results look like: https://caldom.org/benchmark/
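For a rough idea of the approach (a generic Puppeteer tracing sketch, not pfreak's actual API; the URL and selector are placeholders):

    import puppeteer from 'puppeteer';

    // Collect a raw Chrome trace over CDP while a benchmark task runs.
    // The saved JSON can be loaded in the DevTools Performance tab.
    async function trace(): Promise<void> {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.goto('http://localhost:8080/bench');   // placeholder bench page
      await page.tracing.start({ path: 'trace.json' });
      await page.click('#run');                         // placeholder trigger
      await page.tracing.stop();
      await browser.close();
    }

    trace();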
Flawed in what sense? I don't doubt that there are some conceptual drawbacks, and this only benchmarked Chrome, not other browsers. I do think there's some utility in relative comparisons that have a standard/fixed baseline, as it would still seem to show what overhead a framework/library brings to the table.
FWIW, my initial impression of Imba is that it's very impressive. I do think you rightly point out that, at this point, it may still be hard to leave the larger ecosystems of React/Vue/etc. The DOM/UI speed of a project's JS toolkit has generally not had any meaningful impact on the projects I've worked on in the last several years - the data size, audience, and app space just don't really call for it. However... as my needs change, Imba will be something I'll revisit. Thank you.
This is a very misleading way to report those results (likely unintentionally). The JS Framework site includes benchmarks for imba-v1.5.2 and svelte-v3.29.4 and reports that they are equally fast (1.04 and 1.05). It shows both as similarly faster than React.
As described in your second link, that benchmark is timing something a bit different – and we don't know how well Svelte would perform on it (I'm guessing fairly similarly, since the overall approach seems similar. But there's no way to know without measuring.)