Comment by warner25

2 years ago

How exactly is it that webpages with a couple MB of JavaScript or CSS or whatever so reliably cause our CPUs and fans to go nuts when much larger and more complex programs on our machines don't?

I just don't have a good intuition for what's happening here. In these discussions, people always talk like a webpage with 1 MB of JavaScript is a monstrosity, which, yeah, makes sense in absolute terms; it takes a lot of lines of source code to fill a text file up to 1 MB. But in relative terms, I have a bunch of programs on my machine that take up hundreds of MB of storage, some of which do heavy scientific computation, yet my laptop fan stays quiet most of the time, right up until I visit a page on reddit.com (so now I always make sure to use old.reddit.com instead).

I have a graduate-student understanding of computer systems, but again I just don't have a strong intuition for what's happening here. Can someone explain?

I mean, there's a difference between the size of a codebase and its efficiency. I could write a ten-line program that pegs a CPU, and a ten-thousand-line program that doesn't cause any trouble at all.
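
To make that concrete, here's a rough sketch of what a "ten-line program that pegs a CPU" could look like (TypeScript, purely illustrative; the filename and run command are just examples, and obviously nothing like what a real site ships):

    // busy.ts -- a deliberately tiny program that pegs one CPU core.
    // Size has nothing to do with it: the loop never yields or sleeps,
    // so the core never gets a chance to idle.
    // Run with e.g. `npx ts-node busy.ts`, then Ctrl-C to stop.
    let counter = 0;
    while (true) {
      // trivial work, repeated forever with no pause
      counter = (counter + 1) % 1_000_000;
    }

Conversely, a ten-thousand-line program that spends most of its time waiting on I/O or user input will barely show up in a CPU monitor.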

  • Sure, I get that. But is the code behind something like reddit.com really that poorly written? Like a buggy, half-completed homework assignment that I turned in as a freshman CS student? If so, that seems to be an entirely different problem from the one discussed here.