Comment by panzerboiler
4 years ago
He does it "the right way™". Use the platform. Don't use any framework or generic library. Go straight to the point and code what you need, when you need it. Don't minify or bundle anything, and give the people who are learning and curious a straightforward way to connect the dots, without forcing them into a GitHub repository with 90% of the code unrelated to the thing, existing just to glue 1000 pieces written by 10000 people together. Every essay by Bartosz is so top-notch and such a breath of fresh air! He gives me hope in humanity and I am immensely grateful for what he does.
I strongly disagree that this is "the right way". I think that the platform provides low level primitives that are _designed_ to have abstractions built upon them.
Doing it like this has the potential to be the most performant, but it does so in the same way as writing your programs directly in assembly is potentially performant.
I also don't think that the source code is particularly readable for me, and contains lots of magic numbers and very imperative code. I would personally find it a lot more readable if it was written in some sort of declarative way using a library, even if I have to look at a GitHub repo instead of view source.
> but it does so in the same way as writing your programs directly in assembly
> contains lots of magic numbers and very imperative code
Well, we really don't know whether the code was written in this form by hand, do we?
It could have been compiled into this, to use your words, "assembly with magic numbers and imperative code", from a much more elegant form. We may see this form only because this is what browsers understand.
I am not saying it was compiled, just speculating that seeing pure WebGL does not mean it was pure WebGL to begin with.
It was.
https://twitter.com/BCiechanowski/status/1522067904522428417
Graphics code tends to be imperative and have lots of magic numbers. I suppose it's the math-intensive nature of it.
Personally I'm not a fan of the magic numbers either, but as I study more and more of it, it's everywhere.
When there's physics, graphics, and mathematics, there are magic numbers: results of formulas that only need to be computed once, material properties, or nature's constants.
Also, nature and graphics work as an imperative parallel machine, so the code mirrors that.
This is not written deliberately this way. Code comes out like that when you strip all the libraries, fluff, and other less-related stuff.
I also write a scientific application, and yes, this is the way.
it depends on whether you are doing something to get paid, to last, or to be really good. only in the first case do i ever consider a heap of abstractions
Abstraction is the only thing that makes any of our advancements possible. Not even the simplest of math theses could be proven without a “framework” of relevant lemmas, nor could you write even a single hello world without the layers upon layers of abstractions written carefully over the decades. Sure, there is also bad abstraction, but the problem is the bad part, not the concept itself.
Without abstractions you wouldn’t be able to read text stored on a remote computer, with accompanying style information, displayed the same on both of our devices, with embedded 3D graphics behaving identically on vastly different devices, be it a top-of-the-line GPU or a simple low-end phone. Is that not abstraction?
5 replies →
This is so backwards.
1 reply →
On a scale of 1 to 10 how strongly are we talking here?
9.5 Your PR will be held up for at least a month with the back and forth.
> has the potential to be the most performant
It also has the potential to evolve in the most efficient way.
I mostly agree with you, but I don’t mind minification when appropriate, as it can serve a functional purpose with tangible end-user-friendly benefits (less downloaded over the network = faster response times).
But if you want to be friendly to the tinkerers, you could always host both the *.js and *.min.js versions, and have the webpage just pull the latter - anyone who wants the unminified source can remove the “min” part from the URI, while the majority of end users will still benefit from pulling the minified js.
minified js is not greatly smaller than gzipped js. I think the whole minification thing is a swing and a miss, and now we have to deal with source maps and shit, and build pipelines, etc.
To the surprise of no one, Brotli does better on both.
If I were serving this content, and if my web server and all of my target browsers supported Brotli, I'd be somewhat more content to ship an un-minified + Brotli-compressed file than an un-minified + gzip'd one. I'm sure it's some rule of thumb stuck in my head from the Web 2.0 era, but a JavaScript payload in excess of 40KB crosses some warning line in my head. (Probably 40KB / ~4KB/s throughput on a good dial-up connection = 10s transfer time, about the longest you'd want to wait for even a pretty spiffy page to load.)
1 reply →
> and now we have to deal with source maps and shit
Yeah, minification is only really for obfuscation. The small and unpredictable difference is absolutely not worth the ridiculously complex "solution" of source maps. Just the fact that your debugger really doesn't work right is a deal breaker in and of itself, not to mention all the time spent configuring and fighting with webpack.
I don't think any form of "compilation", i.e. bundling, transpiling, minification, etc., is needed at all. JavaScript can already dynamically load (additional) code files when needed; I don't understand why you need to bundle in the first place.
I don't buy that the HTTP request overheads are so big that they justify all this complexity, and in the average case a user doesn't use every single page of the application anyway, so by bundling everything you are always serving "too much" compared to just dynamically loading additional code.
Gzipped JS is generally much smaller than minified JS, but minified-then-gzipped JS is smaller still. The minification (assuming gzip) doesn't make much of a difference in this case only because the input file is not that large and compression algorithms naturally do better on larger inputs. You can (rightly) claim it is bad to have a JS file large enough that minification makes a difference after all, but you'd be moving the goalposts then.
True, but it also removes the comments and the whitespace, leading to slightly better performance and memory usage. There are also fewer bytes to gzip on the server side.
2 replies →
A discoverable version would be to include source maps that link to the original as well. That way a browser console will automatically pull up the original.
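For reference, that link is just a trailing comment in the served file pointing at a map whose `sourcesContent` field can embed the original source, so devtools display it automatically. A hypothetical sketch of the convention (file names are made up):

```javascript
// app.min.js -- minified code above, then a single magic comment:
//# sourceMappingURL=app.js.map
```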
People measure minification in byte size (unfortunately I guess CDNs charge you by that metric too?). In reality everything text-based compresses really well over the wire. In either case, importing tons of libs left and right is going to vastly outweigh any minification savings, yet most front-end devs are very liberal with dependencies.
Minification strips comments too though, which may be undesirable in many cases.
That's simply not a very well-followed (and thus discoverable) standard. Especially for hand-crafted code, minifying function and variable names only obfuscates what is written, and minifying whitespace often has only minimal benefits.
In practice this seems to be a lost cause, and linking to alternatively hosted source code is more common. Sadly this makes it simple to introduce subtle, harmful differences between the source and what is hosted.
The pattern is extremely common on CDNs that serve JS.
It's hard to guess that extra assets exist on the server if they aren't being pulled down by the site itself.
Seems better just to have premassaged source available in a repo somewhere, or called out on the page itself for a downloaded archive.
> He does it "the right way™". Use the platform. Don't use any framework or generic library.
Hard disagree. "Use What's Right For You™".
Of course there is value in understanding the platform beneath your framework or generic library, but that's just an extension of "understand what you're using and why".
We need a ciechanow.ski explainer for how ciechanow.ski explainers are built
Where are the comments in his code? :-)
He made this in the spirit of watch making. Super impressive and interesting website!
> Don't minify or bundle anything
Yeah, in this case it doesn't need it; there's no extraneous or unused code or documentation blocks, and gzip (and comparable) compression is good enough that minification wouldn't reduce the downloaded code size by much.
the obvious downside is that it's a lot of work and takes a lot more time... so it might be "the right way™" for some cases, but it's definitely not a rule of thumb...
The tradeoff is that there is basically nobody else that has the expertise or time to do the same thing at a similar level of polish. We're not going to see more Ciechanowski-level posts unless new libraries and frameworks make it more accessible.
We definitely won't if people are taught that frameworks are the only option and never allowed to just write a full program on their own.
Nobody's stopping you from not using a framework, and yet there is basically nobody else at Ciechanowski's level. It's not going to happen; you can't expect everyone to become a hardcore WebGL expert (have you tried?). If we want more cool interactive visualizations, we have to make it easier. Otherwise, we're stuck waiting for those with the time and expertise to pull it off.
Maybe, maybe not. We should do the experiment, though.
What experiment do you mean?