Comment by throwaway2214

4 years ago

Minified JS is not greatly smaller than gzipped JS. I think the whole minification thing is a swing and a miss, and now we have to deal with source maps and shit, and build pipelines, etc.

    $ ls -la
    -rw-r--r-- 1 jack 197609 330905 May  4 22:56 watch.js
    -rw-r--r-- 1 jack 197609 152172 May  4 22:55 watch.min.js

    $ gzip watch.js
    $ gzip watch.min.js
    $ ls -la
    -rw-r--r-- 1 jack 197609 43690 May  4 22:56 watch.js.gz
    -rw-r--r-- 1 jack 197609 32507 May  4 22:55 watch.min.js.gz

To no one's surprise, Brotli does better on both:

    $ ls -l *.js
    -rw-r--r--  1 mrd  staff  330904  5 May 01:04 watch.js
    -rw-r--r--  1 mrd  staff  152172  5 May 01:10 watch.min.js
    $ brotli watch.js
    $ brotli watch.min.js
    $ ls -l *.br
    -rw-r--r--  1 mrd  staff  34461  5 May 01:04 watch.js.br
    -rw-r--r--  1 mrd  staff  27122  5 May 01:10 watch.min.js.br
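For concreteness, here are the ratios implied by the byte counts in the two listings above (the Brotli run happened to use a copy one byte smaller, which doesn't change anything):

```python
# Byte counts copied from the ls listings above.
plain, minified = 330905, 152172
plain_gz, min_gz = 43690, 32507
plain_br, min_br = 34461, 27122

print(f"gzip keeps {plain_gz / plain:.1%} of the un-minified file")  # ~13.2%
print(f"brotli keeps {plain_br / plain:.1%}")                        # ~10.4%
print(f"minify + gzip keeps {min_gz / plain:.1%}")                   # ~9.8%
print(f"minify + brotli keeps {min_br / plain:.1%}")                 # ~8.2%
```

So the entire argument is over the last few percentage points: un-minified Brotli lands between minified gzip and minified Brotli.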

If I were serving this content, and if my web server and all of my target browsers supported Brotli, I'd be somewhat more content to ship an un-minified + Brotli-compressed file than an un-minified + gzip'd one. I'm sure it's some rule of thumb stuck in my head from the Web 2.0 era, but a JavaScript payload in excess of 40KB crosses some warning line in my head. (Probably 40KB / ~4KB/s throughput on a good dial-up connection = 10s transfer time, about the longest you'd want to wait for even a pretty spiffy page to load.)

  • > I'd be somewhat more content to ship an un-minified + Brotli-compressed file than an un-minified + gzip'd one.

    Whoops, typo: I meant to say that I'd be somewhat more content to ship an un-minified + Brotli-compressed file than a minified + gzip'd one. That is, I'd be more happy to serve the 34.4KB watch.js.br than the 32.5KB watch.min.js.gz.

> and now we have to deal with source maps and shit

Yeah, minification is only really for obfuscation. The small and unpredictable size difference is absolutely not worth the ridiculously complex "solution" of source maps. Just the fact that your debugger doesn't really work right is a deal breaker in and of itself, not to mention all the time spent configuring and fighting with webpack.

I don't think any form of "compilation" (i.e. bundling, transpiling, minification, etc.) is needed at all. JavaScript can already dynamically load additional code files when needed; I don't understand why you need to bundle it in the first place.

I don't buy that the HTTP request overheads are so big that they motivate all this complexity, and in the average case a user doesn't use every single page of the application anyway, so by bundling everything you are always serving "too much" compared to just dynamically loading additional code.
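The dynamic loading being referred to is the native import() that browsers and Node already support with no bundler involved. A minimal sketch (a data: URL stands in for a module file the server would serve on demand, so this runs self-contained):

```javascript
// A tiny ES module inlined as a data: URL, purely so this sketch is
// self-contained. In a real app you'd call import("./pages/settings.js")
// or similar only when the user actually navigates there.
const source = 'export function greet(name) { return "hello, " + name; }';
const url = "data:text/javascript," + encodeURIComponent(source);

(async () => {
  // The module is fetched and evaluated only when this line runs.
  const mod = await import(url);
  console.log(mod.greet("world")); // prints "hello, world"
})();
```

Pages the user never visits are simply never downloaded, which is the opposite trade-off from shipping one big bundle up front.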

Gzipped JS is generally much smaller than minified JS, but minified-then-gzipped JS is smaller still. Minification (assuming gzip) doesn't make much of a difference in this case only because the input file is not that large, and compression algorithms have a natural bias toward larger inputs. You can (rightly) claim it is bad to have a JS file large enough for minification to make a difference, but then you'd be moving the goalposts.
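That size bias is easy to demonstrate with Python's stdlib gzip module. The input here is a synthetic, highly repetitive stand-in for real source code, so the effect is exaggerated, but the direction holds generally:

```python
import gzip

# A JS-like snippet used as stand-in input.
snippet = b"function add(a, b) { return a + b; } // tiny helper\n"

small = snippet * 10       # ~0.5 KB input
large = snippet * 10_000   # ~0.5 MB input

# Compressed size as a fraction of the input size.
small_ratio = len(gzip.compress(small)) / len(small)
large_ratio = len(gzip.compress(large)) / len(large)

print(f"small input keeps {small_ratio:.1%}, large input keeps {large_ratio:.1%}")
```

The larger input compresses to a much smaller fraction of its size, since the compressor has far more repeated structure to exploit.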

True, but it also removes the comments and the whitespace, leading to slightly better performance and memory usage. There are also fewer bytes to gzip on the server side.

  • Slightly, but is it enough to warrant the extra steps?

    I don't think the difference is significant enough in this case.

    That said, I do think there should be an alternative to minification + gzipping, e.g. a compiled form of JS that is more optimized than what a browser's own JIT compiler can do. Mind you, that might end up being a larger package than the JS source code.
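For reference, the comment-and-whitespace stripping mentioned above can be sketched naively as below. Real minifiers (terser, esbuild, etc.) parse the code first; regexes like these break on string literals containing "//", on regex literals, and on semicolon-less code, so this is an illustration only:

```javascript
// Naive sketch only: drops // line comments and collapses whitespace.
// Do NOT use on real code; it mangles string and regex literals.
function naiveMinify(src) {
  return src
    .replace(/\/\/[^\n]*/g, "") // strip line comments
    .replace(/\s+/g, " ")       // collapse runs of whitespace
    .trim();
}

const out = naiveMinify("function f() { // add\n  return 1 + 1;\n}");
console.log(out); // prints "function f() { return 1 + 1; }"
```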