JavaScript-heavy approaches are not compatible with long-term performance goals

14 hours ago (sgom.es)

This title is very misleading; it should be "Why React is not compatible with long-term performance goals".

And I do agree generally. React uses an outdated rendering method that has now been surpassed by many better frameworks. Svelte/Sveltekit, Vue, and Qwik are the best examples.

People relying on bloated React packages is obviously not great, but that has nothing to do with JavaScript itself.

The JS engines are all relatively fast now, and the document model of the current web provides major accessibility benefits to both humans and search tools (SEO and GEO). JS is not my favorite language; I would rather the web were based on a statically typed language with better error-handling practices, like Go. But that will most likely not happen any time soon, as it would require every level of the ecosystem to adapt: browsers, frameworks, developers, etc.

Google would have to take the lead and implement this in Chrome, then enough developers would have to build sites using it and force Safari and Firefox to comply. It just isn't feasible.

If you want faster webapps, just switch to SvelteKit, Vue, or Qwik. But often the people choosing the framework for a project haven't written much code in years. They know React is a safe option used by everyone else, so they follow along; and if it gets slow, it must be a "bug", since the apps they built this way before were "good enough".

  • > Google would have to take the lead and implement this in Chrome, then enough developers would have to build sites using it and force Safari and Firefox to comply. It just isn't feasible.

    This is not something you really want to happen for the health of the web tech ecosystem. I am surprised to see actual developers nonchalantly suggesting this. A type system for the web is not worth an IE 2.0

  • > Google would have to take the lead and implement this in Chrome, then enough developers would have to build sites using it and force Safari and Firefox to comply. It just isn't feasible.

    They already tried. It was called Dart and for a while there was an experimental flag to enable it directly in Chrome. It was cancelled and Dart was relegated to transpiling to JS/WASM.

    • Agreed that Dart was an effort to do exactly this.

      To be a bit pedantic... "relegated to transpiling to JS" makes it sound like that was something they had to do later, but in fact that was up front part of the transition plan. The idea was, write your code in Dart and it would work natively in browsers that (eventually) support it but would still work in everything else via JS transpilation. You're right that it was relegated to only working that way once it was rejected.

      The problem was (if the Wikipedia article is to be believed) that developers took offense to one company unilaterally deciding on the future scripting language of the web. A little ironic given that's how JS evolved in the first place, but fair enough. Later, WASM was created instead, with the idea that no particular language had to be chosen (and frozen) as the JS replacement. Dart compilation to WASM followed.

    • There was a complete fork to support it, Dartium.

      https://chromium.googlesource.com/dart/dartium/src/

      Dart is only around because it was rescued by AdWords team, which had just migrated from GWT into AngularDart, and wasn't happy to do yet another transition.

      Most of the key names from Smalltalk/Self background that were part of Dart 1.0 left the team at this point.

      It managed to stay around long enough to get the attention of the Flutter team, originally their prototype was done in JavaScript, which is also the reason why Dart 2.0 was a complete reboot of its type system, from dynamically typed with optional typing, to strongly typed language with type inference.

      And it isn't as if Flutter is a great success, Fuchsia never made it big, the Android team counter attacked with Kotlin's adoption, JetPack Compose, and Kotlin Multiplatform.


  • The reason I'm a backend dev at the moment is that I looked at the React model and decided I didn't want anything to do with this insanity.

    I've been appalled by how long and how broadly the mass hysteria lasted.

    • And what's crazy is that with AI, the percentage of apps developed using React compared to all other frameworks has INCREASED over the last 3 years.

      It is truly mass hysteria. I would say that 95% of developers, project managers, and CTOs do not truly understand how these systems work under the hood, or at the very least are too scared and comfortable to try other systems. They just repeat the same things they hear and tell each other: "react has a big ecosystem", "react is industry standard", "everyone uses react", "react was developed by facebook", "we will use react", "developers only know react, how could we hire for a different framework?"

      In my mind it's clear that the alternatives are massively better. When I visit certain websites I often get a random tingle that they use Svelte, because they're way faster and have better loading and navigation than other sites, and when I check the devtools I'm almost always correct.

      I also get the same feeling sometimes when I hit a laggy, slow webapp, open the devtools, and clearly see it's a Next.js app.

      A dev team could be trained on Svelte or Vue in literally 3 days, and so long as they actually understand how HTML, JS, and CSS work under the hood, they would increase their speed massively. People just don't want to take the risk.


    • I managed to avoid learning React long enough to become a manager.

      I'm not saying I won't ever end up in engineering and won't ever have to learn it, but at least right now, it feels kinda like I got away with something.

    • I remained unemployed after I was laid off, until a change of career happened. I refused to go back to Angular/React/Vue. I was not going to shovel shit anymore for people who were clearly incompetent and really entitled about it.

    • Unfortunately it is hard to avoid when doing backend development with SaaS and iPaaS products, where JavaScript is in many cases the only extension language for backend widgets.

      I am kind of alright with Next.js, as it is quite similar to using SSR with more traditional backend stacks.

      Also, at least Node's C++ addons give me an excuse to occasionally throw C++ into the mix.

  • Didn't the React team present an exploration of Performance?

    - https://www.youtube.com/watch?v=uAmRtE52mYk

    It showed that React is fast already, faster with the compiler, could even be way faster than signal-based approaches, and faster still if it dropped some of the side effects it performs. Pulling from memory here.

  • Do you consider Angular to have a better rendering system? Or is it similar to React?

    Asking because I use Angular and want to learn other frameworks in case Angular is just as bad for long term.

    • I analyzed all the major frameworks years ago and went with Svelte when Svelte 5 came out. It is all I use now. The development team is highly responsive and cooperative with the community, and the founder, Rich Harris, has a pragmatic approach: keep it as compatible with HTML/JS as possible while still providing great DX and massive efficiency. It natively supports server-side rendering that turns into a SPA once it hits the browser.

      Angular is probably more efficient than React, but the DX is going to be worse than Svelte or Vue.

      If you want the fastest apps in environments with low ping then use Qwik.

      If you want the best overall experience, great customizability, compatibility with any JS package, great efficiency in most situations, and a great developer experience, just go with Svelte (or maybe Vue).

      Angular might be fine, I don't know; I never used it extensively. But I do know that Svelte is the only framework that I like using now.

    • Angular uses directives for rendering, which allows compilers to optimize the rendering.

      React's model uses plain JavaScript expressions, which are harder to optimize.

  • > React uses an outdated rendering method that has now been surpassed by many better frameworks. Svelte/Sveltekit, Vue, and Qwik are the best examples.

    I strongly disagree with this. Svelte/Solid/Vue all become a twisted mess eventually. And by "eventually" I mean "very very soon".

    The idea of using proxies and automatic dependency discovery looks good from the outside, but it easily leads to models with random hidden interdependencies.

    React's rendering model is simplistic ("just render everything and then diff the nodes"), but it's comprehensible and magic-less. Everything is explicit, with only contexts/providers providing any "spooky action at a distance".

    And the recent React versions with Suspense neatly add the missing parts for async query/effects integration.

    > If you want faster webapps, just switch to SvelteKit, Vue, or Qwik.

    If you want even worse webapps then switch to Vue and forgo being able to ever maintain them.

    • I've worked on large React and Solid codebases and don't agree at all. You can make a mess of either one if you don't follow good practices. Also, dynamic dependency management is not just a nice-to-have; it's critical to why Solid's reactive system is more performant. Take a simple example of a useMemo/createMemo that contains conditional logic based on a reactive variable: in one branch, a calculation references a lot of reactive state, while the other branch doesn't. In React, the callback is re-executed whenever that state changes, even when the branch that uses it is not active; in Solid it isn't, because dependencies are tracked at runtime.
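
      A hand-rolled sketch of what runtime dependency tracking buys you (illustrative only; this is not Solid's actual implementation, and the names are invented):

      ```javascript
      let currentObserver = null;

      function createSignal(value) {
        const subs = new Set();
        const read = () => {
          if (currentObserver) subs.add(currentObserver); // track the reader
          return value;
        };
        const write = (next) => {
          value = next;
          const toRun = [...subs];
          subs.clear(); // readers re-subscribe when they re-run
          toRun.forEach((fn) => fn());
        };
        return [read, write];
      }

      function createMemo(fn) {
        let cached;
        const run = () => {
          const prev = currentObserver;
          currentObserver = run; // record which signals fn() actually reads
          cached = fn();
          currentObserver = prev;
        };
        run();
        return () => cached;
      }

      // The conditional-dependency case described above:
      let runs = 0;
      const [useExpensive, setUseExpensive] = createSignal(false);
      const [bigState, setBigState] = createSignal(1);
      const result = createMemo(() => {
        runs++;
        return useExpensive() ? bigState() * 100 : 0;
      });

      setBigState(2);        // bigState was never read: the memo does NOT re-run
      setUseExpensive(true); // re-runs; bigState is read now, so it gets tracked
      setBigState(3);        // this time the memo re-runs (runs === 3, result() === 300)
      ```

      A deps-array model would have to either list bigState up front (and recompute on every change) or omit it (and go stale); tracking reads at runtime sidesteps that choice.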

  • I keep enjoying SSR with Java and .NET frameworks as much as possible since the 2000's, no need for Go alone.

    React is alright when packaged as part of Next.js, which basically looks like React while in practice being SSR with JavaScript.

  • Saying that go has good error handling is… bold.

    Letting Google implement a new language for the web would probably result in something even worse than JavaScript.

    • > Letting Google implement a new language for the web would probably result in something even worse than JavaScript.

      Why? JavaScript was famously written in ten days, was intended for casual scripting, and has never made up its mind whether it wants to be a Java or a Scheme. What could Google invent that would be worse?

  • I'm tired of HN complaining about React for dumb reasons. To be clear, performance and bundle size matter for some things; if that's your complaint, whatever, I have no issue. My issue is that there is a whole host of use cases for which React is perfect. It has a functional, declarative style which favours human understanding and correctness. This is a good thing, and it is what the core of modern React gives you with functional components and hooks. You can do a lot with the core these days. Svelte and Vue are inferior for the same reasons React is good: React is functional where those other choices are imperative. I could go on, but stop trying to make Svelte a thing; no one is buying.

    • I come from the native desktop/mobile world more so than from web, but I’d strongly contest the idea that declarative/functional is always better than imperative/OOP.

      My experience is that declarative starts getting mind-bendy and awkward past a certain point of complexity. It's nice when it's simple enough for everything to fit on a single screen without too much scrolling, but past that you need to start breaking it out into separate files. That's not too bad initially, but eventually you end up with an infinite-Matryoshka-doll setup which sucks to navigate for anybody who doesn't know the codebase, because they have to jump through 7 files to drill down to the code they're actually interested in.

      Also, the larger the project the more likely it is that you’re going to have to pull off acrobatics to get the desired behavior due to the model not cleanly fitting a number of situations.

      Declarative is solid for small-to-medium projects, but for more “serious” desktop class software with complex views and lots of lanes and such I’d be reaching for an imperative framework every time. Those require more boilerplate and have their own pitfalls, but scale much better on average for a dev team with a little discipline and proper code hygiene.

  • React's model is fine; it's just the wrong language. React implemented in Rust, for example, can be much faster, although you pay the cost differently for WASM <-> JS communication.

A cogent article. But I think the biggest problem is that the DOM was built for documents, not apps. We know how to build a performant UI architecture: Qt, Java/Swing, Cocoa all have pretty similar architectures and they all ran fine on much poorer hardware than a modern browser on an M1. But unless you use WebAssembly, you can't actually use them on the browser.

When the industry shoehorns something into a tool designed for something else, yeah, performance suffers and you get a lot of framework churn with people trying to figure out how to elegantly cut steaks with spoons.

  • You can easily test whether the DOM is a performance bottleneck. DOM performance means one of two things: lookup/access speed and render speed. Render speed is only a DOM concern with regard to the quantity of nodes and node layering, though, as the actual painting to the display is a GPU concern.

    To test DOM access speed, simply compare processing speed during test automation of a large single-page app with all DOM references cached to variables versus the same application with no such caching. I have done this, and there is a performance difference, but that difference cannot be noticed until other areas of the application are very well optimized.
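
    A rough harness for that comparison might look like this (a sketch; `timeLookups` and the selector are invented, and a real test would use performance.now() and far more iterations):

    ```javascript
    // `doc` is anything with a querySelector method: the real document in a
    // browser, or a stub in other environments.
    function timeLookups(doc, selector, iterations) {
      let start = Date.now();
      for (let i = 0; i < iterations; i++) {
        void doc.querySelector(selector).textContent; // fresh lookup each time
      }
      const uncachedMs = Date.now() - start;

      const cached = doc.querySelector(selector); // resolve the reference once
      start = Date.now();
      for (let i = 0; i < iterations; i++) {
        void cached.textContent; // property access only
      }
      const cachedMs = Date.now() - start;
      return { uncachedMs, cachedMs };
    }

    // In a browser: timeLookups(document, "#app", 100000)
    ```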

    I have also tested the performance of node layering, and it's also not what most people think. To do this I used an application that was like a desktop UI with many windows that can be dragged around, each with its own internal content. I found things slowed down considerably when the page had over 10,000 nodes displayed across a hundred or so windows. This was slower than equivalent desktop environments outside the browser, but not by much.

    Most people seem to form unmeasured opinions of DOM performance that do not hold up under tests. Likewise they also fail to optimize when they have the opportunity to do so. In many cases there is active hostility against optimizations that challenge a favorite framework or code pattern.

  • But most apps are documents: they are built to render data and text fields in a nice way for the consumer to use.

    You most certainly shouldn't be building graphs with table elements, but JS has canvas and SVG, which make vectors pretty efficient to render.

    The document model provides good accessibility and the ability for things like SEO and GEO to exist.

    If you are making a racing simulator, then using HTML makes no sense; but for the apps that most of us use, documents make sense.

    It would be nice if browsers implemented a new interpreted, statically typed language with direct canvas/viewport rendering that was more efficient than JavaScript, but Chrome would need to adopt it, and then developers would need to actually build things with it. It seems like it would currently have to come from within the Chrome team directly, and they are the only ones who could control something like this.

    • The irony is that CSS works fairly okay for the small number of UI elements and web games that are decidedly not documents. Or perhaps that's not so much irony as filling in the gaps.


  • While I don't have the performance bottleneck numbers of React, I don't think it's about Javascript vs. WASM here.

    I've seen/built some large Qt/QML applications with a lot of JavaScript, and they all performed much better than your average React webapp. In fact, V8 and the other browser JavaScript engines have a JIT, while the QML engine didn't.

    Comparing QtQuick/QML + JS to HTML + JS (both GPU-accelerated scene graphs), you should get similar performance in both. But in reality that is rarely the case. I suspect it might be the whole document-oriented text layout and CSS rules, along with React using a virtual DOM and a lot of other dependencies to give us an abstraction layer.

    I'd love to know more about this from someone who did an in depth profiling of the same/similar apps on something like QtQuick vs. React.

  • > the biggest problem is that the DOM was built for documents, not apps

    I don't see the difference. They're both text and graphics laid out in variable-sized nested containers.

    And apps today make use of all the same fancy stuff documents do: fonts, vector icons, graphics, rounded corners, multilingual text including RTL, drop shadows, layers, transparency, and so forth.

    Maybe you think they shouldn't. But they do. Of all the problems with apps in web pages, the DOM feels like the least of it.

  • It's very possible to make lightning-fast React web UIs. The DOM sucks, but modern computers are insanely fast, and browsers insanely optimized. It is also very possible to make sluggish-feeling Qt or Swing applications; I've seen a number.

    It mostly takes some thinking about immediate reaction, about "negligibly short" operations introducing non-negligible, noticeable delays. Anything not related to rendering should be made async, and even that should be made as fast as possible. This is to say nothing of avoiding reflows, repeated redraws, etc.
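
    One way to push such work off the critical path (a sketch; `deferWork` is an invented helper, and requestIdleCallback is only used where it exists):

    ```javascript
    function deferWork(fn) {
      return new Promise((resolve, reject) => {
        // Prefer the browser's idle scheduler; fall back to a macrotask.
        const schedule =
          globalThis.requestIdleCallback ?? ((cb) => setTimeout(cb, 0));
        schedule(() => {
          try {
            resolve(fn());
          } catch (err) {
            reject(err);
          }
        });
      });
    }

    // Usage: react to the event immediately, do the bookkeeping later.
    // button.onclick = () => { updateUI(); deferWork(logAnalytics); };
    ```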

    In short, sloppy GUI code feels sluggish, no matter what tools you use.

    • > but modern computers are insanely fast, and browsers, insanely optimized

      I think these facts have been used as excuses to shit up the app layer with slow, mal-optimized JS code.

      An example of a recent high-performance app is Figma, which blows normal JS-only apps out of the water. And it does so by using C++ WASM/WebGPU for its more demanding parts, which is most of it.

      I think we have to let go of the "just get more RAM" approach and start optimizing webapp code like Figma does.


    • > modern computers are insanely fast, and browsers, insanely optimized.

      You made the reverse point: if you need modern hardware to run a React app with the same performance as a Svelte/Vue/Solid app on low-end hardware, something is fundamentally wrong.

    • For most apps, React is never going to be an issue when it comes to performance, unless you use it wrong or use it for something non-standard, like rendering fractals. It makes sense to analyse performance if your website is meant to reach absolutely everyone, including old smartphones. There's the issue that a single-page app can be bloated from the start, but that's also in the "using it wrong" category.

    • > Anything not related to rendering should be made async,

      The thing with DOM interaction is that if you try to make it synchronous then it gets really fucking slow (reflow and friends). So you want it linearized for sanity reasons, but probably not sync.

    • This is the first sane comment in this entire comment section. So much nonsense in here, but I guess I should expect that when JavaScript is in the post title.

I think digressions about React dilute the message. Ok, we get it, react bad; but what is the actionable message here? What are the robust alternatives? There is a section on how changes to react (the introduction of hooks and of the concurrent mode) necessitated significant code changes almost amounting to rewrites; but which alternatives, other than vanilla web platform, can promise long-term stability? Which alternatives are better suited for which kinds of applications (roughly, an e-commerce site vs excalidraw)?

To me, the main problem is that inevitably, any SPA with dozens of contributors will grow into a multi-megabyte-bundle mess.

Preventing it is extremely hard, because the typical way of developing is to write a ton of code, add a ton of dependencies, and let the bundler figure it out.

Visualizing the entire codebase in terms of "what imports what" is impossible with tens of thousands of files.

Even when you do splitting via async `import()`, all it takes is one PR that imports something the wrong way to bloat the bundle by hundreds of kilobytes or megabytes: something the bundler had been quietly putting in an async chunk suddenly becomes mandatory in the main bundle, because a static import crosses the boundary.

The OP mentions it here:

> It’s often much easier to add things on a top-level component’s context and reuse them throughout the app, than it is to add them only where needed. But doing this means you’re paying the cost before (and whether) you need it.

> It’s much simpler to add something as a synchronous top-level import and force it to be present on every code path, than it is to load it conditionally and deal with the resulting asynchronicity.

> Setting up a bundler to produce one monolithic bundle is trivial, whereas splitting things up per route with shared bundles for the common bits often involves understanding and writing some complex configuration.

You can prevent that by having strong mandatory budgets on every PR, which checks that the bundle size did not grow by more than X kB.

But even then, the accumulation of hundreds or thousands of PRs, each adding 1 kB, eventually bloats the bundle enough to become noticeable.

Perf work is thankless work, and having one perf team trying to keep things at bay while there are dozens of teams shipping features at a fast pace is not gonna cut it.

  • Exactly. To avoid dependency explosion, you need at least one of: 1) most libraries built internally, 2) great discipline across teams, or 3) significant performance investment/mandate/enforcement (which likely comes from a business requirement, e.g. page load time). I have rarely seen any of those in my limited experience.

I've gone all SSR (server-side rendering) with JSX, using Astro or Elysia. If I need frontend logic, I just sprinkle in a little htmx or roll my own inline JS function. It makes debugging 1000% easier, and the PageSpeed scores are usually amazing out of the box.

My 2 cents: I am not an experienced React dev, but the React Compiler came out recently alongside React 19, and it is supposed to do the same thing as Svelte's compiler: eliminate unnecessary DOM modifications by explicitly tracking which components rely on what state, thus making useMemo() unnecessary.

Since the article still references useMemo(), I wonder how up-to-date the rest of the article is.

  • React has always been tracking what component relies on what state, independent of the compiler. That is one of the reasons the rule of hooks has to exist: React tracks whether a component calls useState() and thus knows the corresponding setState function that manipulates that particular state.
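
    A hand-rolled sketch of why the rule exists (illustrative only, not React's code): hook state lives in an array indexed by call order, so the calls must happen in the same order on every render.

    ```javascript
    const hookStates = [];
    let hookIndex = 0;

    function useState(initial) {
      const i = hookIndex++; // this call's slot is its position in call order
      if (!(i in hookStates)) hookStates[i] = initial;
      const setState = (next) => { hookStates[i] = next; };
      return [hookStates[i], setState];
    }

    function render(component) {
      hookIndex = 0; // reset the cursor before every render
      return component();
    }

    function Counter() {
      const [count, setCount] = useState(0);
      const [label] = useState("clicks"); // safe only if order never changes
      return { text: `${count} ${label}`, setCount };
    }

    let ui = render(Counter); // ui.text === "0 clicks"
    ui.setCount(5);
    ui = render(Counter);     // ui.text === "5 clicks"
    ```

    Call a hook inside an `if` and the slots shift between renders, which is exactly what the rule forbids.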

    Idk why people claim React is bloat, especially since you can usually switch to Preact (4 kB) without changes if file size is an issue for you.

    • > Idk why people claim React is bloat

      Because it's very hard to control its rendering model. And the fact that multi-billion-dollar startups and multi-trillion-dollar companies hiring Ivy League charlatans still have bloated, low-performing websites written in React (that don't even need React...) clearly shows the issues aren't that trivial.

      > React has always been tracking what component relies on what state, independent of the compiler.

      This still needs complete parsing and analysis of your components and expressions, which is why there is not a single highly performant UI library that can avoid directives.


  • React Compiler adds useMemo everywhere, even to your returned templates. It makes useMemo the most common hook in your codebase, and thus very necessary; just not as necessary to write manually.

  • useMemo is for preserving derived calculations between renders, which are generally caused by state changes. React always tries to track state changes, but complicated state (represented by deep objects) can be recalculated too often. Not sure if React 19 improves this; I don't think so, from reading the documentation.

    on edit: often useMemo is used by devs to cover up mistakes in rendering architecture. I believe React 19 improves things so you would not need useMemo as much, but I'm not sure it makes useMemo not at all useful.
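
    For reference, the contract useMemo's dependency array expresses boils down to something like this (a standalone sketch, not React's implementation):

    ```javascript
    function createDepsMemo() {
      let prevDeps = null;
      let cached;
      return (compute, deps) => {
        // Recompute only when some dep changed (Object.is, as React uses).
        const changed =
          prevDeps === null ||
          deps.length !== prevDeps.length ||
          deps.some((d, i) => !Object.is(d, prevDeps[i]));
        if (changed) {
          cached = compute(); // the expensive derived calculation
          prevDeps = deps;
        }
        return cached;
      };
    }

    let computations = 0;
    const memo = createDepsMemo();
    memo(() => { computations++; return "a"; }, [1]); // computes
    memo(() => { computations++; return "a"; }, [1]); // cached, no recompute
    memo(() => { computations++; return "b"; }, [2]); // dep changed: recomputes
    // computations === 2
    ```

    It also shows the deep-object pitfall mentioned above: an object recreated on every render never compares equal under Object.is, so the memo recomputes every time.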

> I’ll focus on React and Redux in some of my examples since that is what I have the most experience with, but much of this applies to other frameworks and to JS-heavy approaches in general.

That's not a fair assumption. Frameworks like Svelte, Solid, Vue, etc. have smaller bundle sizes and rendering speeds that approach the baseline vanilla-JS cost.

I'm all for criticising JavaScript, but moving everything to the server isn't a real solution either. Instead of slow React renders (50 ms?), every interaction becomes a client-server round trip. The user pays the cost of the paradigm on each interaction instead of upfront with an initial JS payload. Etc.

  • Yeah, this article is only about React. But it makes sense that someone would think this way, because many devs think JS web apps == React.

    The problem is that React is "good enough" for most cases, and the performance degradation happens slowly enough that the devs/project leads don't see it until it's too late: they are already overly invested in their project, and switching would be too complicated/costly for them.

    Svelte/SvelteKit and properly optimized packages solve almost all of the "problems" this article brings up.

    • To be fair though, even with Svelte et al bundle sizes can balloon quickly when using UI toolkits and other libraries.

  • Plus, Redux is horrible for performance; it slows things down and overcomplicates everything.

    • I never understood why state management is overcomplicated in React. For some reason most people use something like Redux, which has a very unintuitive API, when there are other state management packages available that are much easier to use and understand, like MobX.

      https://github.com/mobxjs/mobx


Github was usable and fast, now it is slow. Guess what changed...

  • It's crazy slow. But it's also a closed-source Microsoft platform for open source, so it belongs in the trash anyway.

  • It was rewritten in React and performance was annihilated. The React virus claimed another victim, and we have one more zombie website.

    • Yeah, but some project or program manager sure got a raise for delivering this nonsense "update".

      That was 100% a project sold to upper management as a benefit and modernization.

This seems to be more about React than JavaScript (or, indirectly, about the DOM). React sitting on top of a browser-style DOM would be slow in any language, while JavaScript itself can be surprisingly fast, especially when sticking to the right subset. It "only" needs a big and complex JS engine to get that kind of performance.

As an aside, I like their use of blockquotes+details/summary blocks for inserting "afterthoughts/addendums".

It's a nice touch, and it works pretty well. Partly because, as a design choice, it forces you to add the afterthought between paragraphs, so the interruption in reading flow is minimal.

Agree with the long-term perf point, and I'd add: JS-heavy apps also tend to hide the "true" critical path (hydration + data + layout), so teams ship regressions without noticing. Do you have a rule of thumb for when a page has crossed the line (TTI/CLS budgets, max JS kB, etc.)? And do you see islands/partial hydration as a real middle ground or just a temporary patch?

I am so grateful to the author for writing this article. For years I've been fighting a series of small battles with my peers who seem hell-bent on "upgrading" our e-commerce websites by rewriting them in React or another modern framework.

I've held the line, firm in my belief that there is truly no compelling reason for a shopping website to be turned into an SPA.

It's been difficult at times. The hype of new and shiny tools is real. Like the article mentions, a lot of devs don't even know that there is another way to build things for the web. They don't understand that it's not normal to push megabytes of JavaScript to users' browsers, or that displaying some text on a page doesn't have to start with `<React><App/></React>`.

That's terrifying to me.

Articles like this give me hope that no, I'm not losing my mind. Once the current framework fads eventually die out - as they always do - the core web technologies will remain.

  • I think shopping gets this in spades too because not all shopping sites are meant to be particularly sticky.

    It's one thing to browse the catalog at my leisure on gigabit networking, a 5k display and 16 CPU cores. It's another thing when I'm standing in Macy's or Home Depot and they don't quite have the thing I thought they have and I'm on my phone trying to figure out if I can drive half a mile to your store and get it. If you want to poach that sale your site better be fast, rather than sticky.

"Now’s a good time to figure out whether your client-side application should have a server-side aspect to it, to speed up initial renders."

My how the tables have turned!

Wonderful article. I do maintenance programming, and many of the problems you mention with typical React apps are also code-maintenance nightmares. Managing a large number of fast-moving third-party dependencies will destroy your developer budget, but devs can't see it because they're "Best Practices".

Only a single passing mention of web components?

  • > You can use isolated JS scripts, or other approaches like progressively-enhanced web components

    How would one use "progressively enhanced" web components? Maybe I misunderstand the intention behind this statement, but web components are either supported or not; there doesn't seem to be any kind of progression.
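
    One reading of it (a sketch; the tag name and helper function are invented): the element's light-DOM content is ordinary server-rendered HTML that renders and reads fine even if the script never loads; the custom element, where supported, only adds behavior on top.

    ```javascript
    // Markup that works with no JS at all:
    //
    //   <collapsible-note>
    //     <p>Readable and indexable even if this script never runs.</p>
    //   </collapsible-note>

    function defineCollapsibleNote(
      registry = globalThis.customElements, // may be absent in old browsers
      Base = globalThis.HTMLElement
    ) {
      if (!registry || !Base) return false; // no support: graceful no-op
      registry.define(
        "collapsible-note",
        class extends Base {
          connectedCallback() {
            const content = this.firstElementChild; // server-rendered content
            const btn = this.ownerDocument.createElement("button");
            btn.textContent = "Toggle";
            btn.onclick = () => content && content.toggleAttribute("hidden");
            this.prepend(btn); // the enhancement: a toggle above the content
          }
        }
      );
      return true;
    }
    defineCollapsibleNote();
    ```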

In other news: water is wet. I genuinely don't understand how anyone is still pretending otherwise. Server-side rendering is so much easier to deliver in a performant way, yet it feels like it's being increasingly forgotten — or worse, actively dismissed as outdated. Out of convenience, more and more developers keep pushing logic and rendering onto the client, as if the browser were an infinitely capable runtime. The result is exactly what this article describes: bloated bundles, fragile performance, and an endless cycle of optimization that never quite sticks.

  • Pure client-side rendering is the only way to get maximum speed with the lowest latency possible. With SSR you always have bigger payloads or double network round trips.

    • That’s a laughable claim. SSR is objectively faster, since the client does nearly zero work other than downloading some assets. If the responses are pre-computed and sitting in server memory waiting for a request to come along, no client side rendering technique can possibly beat that.


  • Server-rendered HTML, HTML-fragment endpoints, and jQuery load() was always the sweet spot for me. McMaster-Carr[0] does the same thing behind the scenes and utterly destroys every "modern" webapp in existence today. Why did everything have to become so hard?

    0: https://www.mcmaster.com/

Why would performance be a goal?

If a native desktop app crashes, your users will gnash their teeth and curse your name. A web app? They shrug and refresh the page. They have been trained to do this for decades now, why would anyone believe this isn't intended behavior?

No one has a reasonable expectation of quality when it comes to the web.

  • > No one has a reasonable expectation of quality when it comes to the web.

    And that's exactly why I want native desktop apps to make a resurgence.

Eh, this argument falls apart for many reasons:

- His main example of bloated client-side dependencies is moment.js, which has been deprecated for five years in favor of smaller libraries and native APIs, and whose principal functionality (the manipulation and display of the user's date/time) isn't possible on the server anyway.

- There's an underlying assumption that server-side code is inherently good, performant, and well crafted. There are footguns in every single language and framework and library ever (he works for WordPress, he should know).

- He's right to point out the pain of React memoization, but the React Compiler now does this for you, and better than you ever could manually

- Larger bundle sizes are unfortunate, but they're not the main cause of performance issues. That'd be images and video sizes, especially if poorly optimized, which easily and immediately dwarf bundle downloads; and slow database queries, which affect server-side code just as much as browser-side code.
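On the moment.js point above: the native replacements ship with the engine, so they add nothing to the bundle, and they run in the user's own locale and timezone, which is exactly the part a server can't do. A sketch (locale and formats chosen arbitrarily):

```javascript
// Native Intl covers moment's most common use cases at zero bundle cost.
const when = new Date(Date.UTC(2025, 0, 15, 9, 30));

// Locale-aware display, formerly moment(when).format(...):
const fmt = new Intl.DateTimeFormat('en-GB', {
  dateStyle: 'long',
  timeStyle: 'short',
  timeZone: 'UTC', // in the browser you'd omit this and get the user's zone
});
console.log(fmt.format(when));

// Relative times, formerly moment(when).fromNow():
const rtf = new Intl.RelativeTimeFormat('en', { numeric: 'auto' });
console.log(rtf.format(-3, 'day')); // "3 days ago"
```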

  • > There's an underlying assumption that server-side code is inherently good, performant, and well crafted

    To me it’s an assumption that server side code is going to be running on a server. Which is a known quantity and can be profiled to the nth degree. It’s extremely difficult to profile every possible device your site will run on, which is crucial with low powered mobile devices.

    > Larger bundle sizes are unfortunate, but they're not the main cause of performance issues. That'd be images and video sizes

    Not really, no. Large bundle sizes prevent the initialisation of the app, which means the user can’t do anything. By comparison, images and videos download asynchronously and get processed on a separate thread. JS bundles also need to be parsed after being downloaded; if you pair a crappy Android phone with a 5G connection, the parsing can literally take longer than the download.

  • > Larger bundle sizes are unfortunate, but they're not the main cause of performance issues. That'd be images and video sizes, especially if poorly optimized, which easily and immediately dwarf bundle downloads; and slow database queries, which affect server-side code just as much as browser-side code.

    In network terms, JS tends to be downloaded at a higher priority than both images and video, so it has a larger impact on content appearing.

    JS is also the primary main-thread blocker for most apps and pages… profile any Next.js app and you’ll see what a horrendous impact it has on the main thread and the visitor’s experience

I was never a big React fan myself. As someone who has used a lot of different JavaScript frameworks over many years, I can say confidently that it's not the best JS framework; especially nowadays due to bloat.

Yet it's better than anything available in any other programming language, on any other platform in existence.

Never bet against JavaScript. People have done this over and over and over again since it was invented. So many haters. It's like every junior dev was born a JavaScript hater and spends most of their career slowly working their way to a state of 'tolerating JavaScript'.

JS was designed to be future-proof and it kept improving over the years; what happened is it improved much faster than people were able to adjust their emotions about it... These people even preached the JS hate to a whole generation of juniors; some of whom never even experienced JavaScript first-hand. It's been cool to hate JavaScript since I can remember.

JavaScript does have some bad parts, as does any other programming language, but the good parts of JS are better than anything else in existence. People keep trying to build abstractions on top (e.g. TypeScript) to try to distance people from JavaScript, but they keep coming back over and over again... And these people will never admit to themselves that maybe the reason they keep coming back to JavaScript is because it's pretty darn great. Not perfect, but great nonetheless.

It's hilarious that now we have WebAssembly, meaning you can compile almost any language to run in the browser... But almost nobody is doing that. Do people realize how much work was required to bring WebAssembly to the browser? Everyone knows it exists, it's there for you to use, but nobody is using it! What does that say? Oh, it's because of the bundle size? Come on! Look at React: React bundles are so bloated, but they are heavily used! The excuse that JavaScript's success is just a result of its browser monopoly is gone!

Enough is enough! The fact is, you probably love JavaScript but you're just too weak to admit it! I think the problem is that non-JS developers have got a big mouth and small hands...

  • > but nobody is using it! What does that say?

    It’s impossible to replace JS with WebAssembly because all state-mutating functionality (DOM tree manipulation and events, WebGL rendering, all other I/O) is unavailable to WebAssembly. They expect people to do all that using JavaScript glue.

    Pretty sure that if WebAssembly had been designed to replace JS instead of merely supplementing it, we would have little JS left on the web.
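To make the "JavaScript glue" point concrete, here is a complete hand-encoded WebAssembly module. It can compute, but everything else — loading it, calling it, and any DOM or I/O work — has to happen from JS:

```javascript
// A complete .wasm binary, hand-encoded: exports add(a, b) -> a + b.
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // one function of that type
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, // body: local.get 0, local.get 1,
  0x6a, 0x0b,                                           //       i32.add, end
]);

// This part is the JS glue: Wasm cannot load itself, and anything touching
// the DOM would have to be a JS function passed in via the imports object.
const instance = new WebAssembly.Instance(new WebAssembly.Module(wasmBytes));
console.log(instance.exports.add(2, 3)); // 5
```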

  • > but the good parts of JS are better than anything else in existence

    What are you talking about?! I can't think of a single thing in JS that I could say is good.

    Okay, two big corporations have invested a lot of money and effort into making V8 and TypeScript, and now it's useful. But I don't consider those exactly part of JS.

  • You lost me at TypeScript. TypeScript is great not because it abstracts away any JavaScript functionality (it doesn't), but because it allows IDE integrations (mainly LSP) to better understand your code, enabling go-to-definition, hover docs, autocomplete, semantic highlighting, code actions, inline error messages, etc.

    But I agree many people are jumping on the javascript hate train without really understanding the modern web landscape.

  • Do people really hate JavaScript, or do they just hate the design choices and results that it seems to be correlated with?

    At the end of the day I’m using SaaS tools that are apparently written in React, and I am astounded by how slow and heavy they are. If you are editing a page on our company's cloud-based wiki, I’ve seen my Chrome RAM balloon from 3 GB to 16 GB. A mistake was made somewhere, that I know.

Our webapp is nearly instant, and it's built on raw React with a sprinkling of TanStack (their Local Collection DB is a masterpiece).

And our stack is intentionally client-heavy. We proactively synchronize all the relevant data and keep it in IndexedDB, with cross-tab coordination. The server is used only for mutating operations and for some actions that necessarily require server-side logic.
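The sync pattern described above boils down to a deterministic merge step plus persistence. A sketch of the merge using last-write-wins (all names invented for illustration; the real app may resolve conflicts differently):

```javascript
// Merge a batch of server changes into the local copy of the data.
// Last-write-wins on updatedAt; a common (if simplistic) conflict policy.
function applyChanges(local, changes) {
  const next = new Map(local); // leave the previous snapshot untouched
  for (const change of changes) {
    const existing = next.get(change.id);
    if (!existing || change.updatedAt > existing.updatedAt) {
      next.set(change.id, change);
    }
  }
  return next;
}

// In the browser, the merged map would be written to IndexedDB, and other
// tabs notified so they reload from it rather than hitting the server, e.g.:
//   new BroadcastChannel('sync').postMessage({ type: 'changes-applied' });
```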

The issue with dependencies and package size is valid, but it's also really not a big deal for performance unless you go WAY overboard. Or use crappy dependencies (Clerk, I'm looking at you: 2.5 MB for an auth library!?).

As for hydration, I've found that it often slows things down. It's a valid approach when you're doing query-based apps, but when you have all the data locally, it makes little sense. It also adds a ton of complexity, because now your code has to automagically run in multiple environments. I don't think it can really ever work reliably and safely.