Comment by ThePhysicist

4 years ago

I was curious how he did those visualizations, so I looked at the source code. Turns out he codes everything by hand in WebGL [1]. Absolutely impressive stuff. The source code is unminified, so you can have a look and understand everything as well.

[1]: https://ciechanow.ski/js/watch.js

I've noticed that developers these days are quite surprised to see anyone write code for OpenGL / WebGL directly instead of using some layer of abstraction on top, such as Three.js or Unity. Few seem to know that OpenGL is already an abstraction of the computing model underneath.

A couple years ago I did some consulting for a company that needed a point cloud rendering engine. Luckily I had one ready to go. I showed them and they liked it and their young devs asked which library I was using. When I told them I used OpenGL they couldn't believe it. To them OpenGL was the "black magic box" and using it akin to having secret conversations with the GPU in some arcane cryptic language.
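
(For anyone who hasn't touched it: raw WebGL is verbose, but it isn't black magic. Here's a minimal sketch of the boilerplate every such program starts from; all names are mine, nothing is taken from watch.js:)

    // Compile two shaders, link a program, upload one triangle, draw it.
    const gl = document.querySelector("canvas").getContext("webgl");

    function compile(type, source) {
      const shader = gl.createShader(type);
      gl.shaderSource(shader, source);
      gl.compileShader(shader);
      if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS))
        throw new Error(gl.getShaderInfoLog(shader));
      return shader;
    }

    const program = gl.createProgram();
    gl.attachShader(program, compile(gl.VERTEX_SHADER,
      "attribute vec2 pos; void main() { gl_Position = vec4(pos, 0.0, 1.0); }"));
    gl.attachShader(program, compile(gl.FRAGMENT_SHADER,
      "precision mediump float; void main() { gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0); }"));
    gl.linkProgram(program);
    gl.useProgram(program);

    // One triangle in a GPU buffer, wired to the "pos" attribute.
    const buf = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    gl.bufferData(gl.ARRAY_BUFFER,
      new Float32Array([-1, -1, 1, -1, 0, 1]), gl.STATIC_DRAW);
    const loc = gl.getAttribLocation(program, "pos");
    gl.enableVertexAttribArray(loc);
    gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

    gl.drawArrays(gl.TRIANGLES, 0, 3);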

  • I wrote interactive 3D graphics programs in SGI IrisGL, the proprietary predecessor of OpenGL. At that time I considered it super easy because it was so high-level. Even so, as an experiment, I networked an Iris to a Lisp machine so I could write graphics code in Lisp and let the SGI machine render the output. Good times.

    History tidbit: Jim Clark, the founder of SGI, invented the GPU which was what made SGI machines so fast at 3D. Later he went on to found Netscape.

    https://en.m.wikipedia.org/wiki/James_H._Clark

    • > History tidbit: Jim Clark, the founder of SGI, invented the GPU which was what made SGI machines so fast at 3D.

      Even more trivia: When I moved to Silicon Valley I bought a used Porsche on a whim (my recruiter implied it was a proper consultant accessory in SV) and fixed it up a bit until it became my daily driver. When I researched the title, I discovered Jim Clark was the original owner. So I'd like to credit Jim Clark for starting me on the road to speeding up my commute.

      2 replies →

  • When I was in college, I got really deep into OpenGL and worked through old textbooks about OpenGL 1.1. I think my favorite one was just called "The Red Book"? It was so much fun.

    After finishing that I found out that they were already several major versions ahead and had fancy things like shaders... it really is an amazing tool.

    Recently I started messing with some Three.js stuff and it does have some nice abstractions... the main benefit of it for me is the ecosystem around it. Being able to just plug in some physics and interactivity, and not have to dig up old screen-to-world conversion code, is nice.
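
    (For reference, this is the kind of conversion I mean, done with plain three.js; a sketch assuming a PerspectiveCamera and a full-window canvas, with a function name of my own:)

        import * as THREE from "three";

        // Convert a mouse event to a world-space point on the z = 0 plane.
        function screenToWorld(event, camera) {
          const p = new THREE.Vector3(
            (event.clientX / window.innerWidth) * 2 - 1,   // x in NDC
            -(event.clientY / window.innerHeight) * 2 + 1, // y in NDC (flipped)
            0.5
          );
          p.unproject(camera);                  // NDC -> world space
          const dir = p.sub(camera.position).normalize();
          const t = -camera.position.z / dir.z; // intersect the z = 0 plane
          return camera.position.clone().add(dir.multiplyScalar(t));
        }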

  • In my half-awake state I read that as “some layer of distraction”. How fitting ;) But back to the original post: yes, his work and website are amazing.

    • When it’s time to simplify and/or troubleshoot overly-complex things, I like to use the phrase “The abstraction is a distraction”.

He does it "the right way™". Use the platform. Don't use any framework or generic library. Go straight to the point and code what you need, when you need it. Don't minify or bundle anything, and give the people who are learning and curious a straightforward way to connect the dots, without forcing them into a GitHub repository with 90% of the code unrelated to the thing, existing just to glue 1000 pieces written by 10000 people together. Every essay by Bartosz is so top-notch and such a breath of fresh air! He gives me hope in humanity and I am immensely grateful for what he does.

  • I strongly disagree that this is "the right way". I think the platform provides low-level primitives that are _designed_ to have abstractions built upon them.

    Doing it like this has the potential to be the most performant, but it does so in the same way as writing your programs directly in assembly is potentially performant.

    I also don't think that the source code is particularly readable for me, and contains lots of magic numbers and very imperative code. I would personally find it a lot more readable if it was written in some sort of declarative way using a library, even if I have to look at a GitHub repo instead of view source.

    • > but it does so in the same way as writing your programs directly in assembly

      > contains lots of magic numbers and very imperative code

      Well, we really don't know if the code was written in this form by hand, do we?

      It could have been compiled into this (to use your words, "assembly with magic numbers and imperative code") from a much more elegant form. We may see this form only because this is what browsers understand.

      I am not saying it was compiled, just speculating that seeing pure WebGL does not mean it was pure WebGL to begin with.

      2 replies →

    • When there's physics, graphics, and mathematics, there are magic numbers: the results of formulas that only need to be computed once, or material properties, or nature's constants.

      Also, nature and graphics both work like an imperative parallel machine, so the code mirrors that.

      This is not deliberately written this way. Code just comes out like that when you strip away all the libraries, fluff, and other less-related stuff.

      I also write a scientific application, and yes, This is the way.
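
      (A trivial sketch of what I mean; the numbers below are invented for illustration, not taken from anyone's code:)

          // What reads as a "magic number" inline is usually a formula that
          // was evaluated once. All values here are invented.
          const DEG_TO_RAD = 0.017453292519943295;  // pi / 180, computed once
          const BEAT_RATE = 4.0;                    // beats per second, a design property
          const OMEGA = 2 * Math.PI * BEAT_RATE;    // angular frequency, derived from it

          // After constant folding, shipped code often keeps only the result:
          const angle = 30 * 0.017453292519943295;  // "magic", but just 30 degrees in radians
          console.log(angle === 30 * DEG_TO_RAD);   // true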

    • > has the potential to be the most performant

      It also has the potential to evolve in the most efficient way.

  • I mostly agree with you, but I don’t mind minification when appropriate, as it can serve a functional purpose with tangible end-user-friendly benefits (less downloaded over the network = faster response times).

    But if you want to be friendly to the tinkerers, you could always host both the *.js and *.min.js versions, and have the webpage just pull the latter - anyone who wants the unminified source can remove the “min” part from the URI, while the majority of end users will still benefit from pulling the minified js.

    • Minified JS is not greatly smaller than gzipped JS. I think the whole minification thing is a swing and a miss, and now we have to deal with source maps and shit, and build pipelines, etc.

          $ ls -la
          -rw-r--r-- 1 jack 197609 330905 May  4 22:56 watch.js
          -rw-r--r-- 1 jack 197609 152172 May  4 22:55 watch.min.js
      
          $ gzip watch.js
          $ gzip watch.min.js
          $ ls -la
          -rw-r--r-- 1 jack 197609 43690 May  4 22:56 watch.js.gz
          -rw-r--r-- 1 jack 197609 32507 May  4 22:55 watch.min.js.gz

      7 replies →

    • A discoverable version would be to include source maps that link to the original as well. That way a browser console will automatically pull up the original.
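
      (A sketch with hypothetical filenames: the pragma on the last line of the minified file is what tells devtools where the map lives, and if the map embeds the originals via "sourcesContent", no separate unminified file even needs to be hosted.)

          // Tail of a hypothetical watch.min.js as emitted by a minifier.
          /* ...minified code... */
          //# sourceMappingURL=watch.min.js.map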

    • People measure minification in byte size (unfortunately, I guess CDNs charge you by that metric too?). In reality, everything text-based compresses really well over the wire. Either way, importing tons of libs left and right is going to vastly out-size any minification, yet most FE devs are very liberal with dependencies.

      Minification strips comments too though, which may be undesirable in many cases.

    • That's simply not a very well-followed (and thus discoverable) standard. Especially for hand-crafted code, minifying function and variable names only obfuscates what is written, and minifying whitespace often has only minimal benefits.

      In practice this seems to be a lost cause, and a link to alternatively hosted source code is more common. Sadly, this makes it simple to introduce subtle, harmful differences between the source and what is hosted.

      1 reply →

    • It's hard to guess that extra assets exist on the server if they aren't being pulled down by the site itself.

      Seems better just to have pre-massaged source available in a repo somewhere, or called out on the page itself as a downloadable archive.

  • > He does it "the right way™". Use the platform. Don't use any framework or generic library.

    Hard disagree. "Use What's Right For You™".

    Of course there is value in understanding the platform beneath your framework or generic library, but that's just an extension of "understand what you're using and why".

  • He made this in the spirit of watch making. Super impressive and interesting website!

  • > Don't minify or bundle anything

    Yeah, in this case it doesn't need it; there's no extraneous or unused code and no documentation blocks, and gzip (and comparable) compression is good enough that minification wouldn't actually reduce the downloaded code size by much.

  • The obvious downside is that it's a lot of work and takes a lot more time... so it might be "the right way™" for some cases, but it's definitely not a rule of thumb...

  • The tradeoff is that there is basically nobody else that has the expertise or time to do the same thing at a similar level of polish. We're not going to see more Ciechanowski-level posts unless new libraries and frameworks make it more accessible.

Can you point to which libraries he could have used that would have made this simpler? I doubt anything like this would benefit from any type of abstraction that currently exists, unless it were a more interactive application that incorporated user input etc.

  • Depending on one's skill set, you could use a DCC tool like Blender plus three.js to make creating these visuals and interactions much simpler. Have a look at the gltfjsx + react-three-fiber [1] combination, which are themselves abstractions over vanilla three.js.

    With that said, the raw WebGL approach here is arguably more educational, so goal achieved, I think!

    [1] https://docs.pmnd.rs/react-three-fiber/getting-started/examp...

    Edit: there's actually a 50 LOC watch example with r3f: https://codesandbox.io/s/bouncy-watch-qyz5r
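
    (For a flavor of the style, a whole scene is roughly the sketch below; "watch.glb" is a placeholder for a model exported from a DCC tool, not a real asset:)

        import { Canvas } from "@react-three/fiber";
        import { OrbitControls, useGLTF } from "@react-three/drei";

        // Minimal r3f scene in the spirit of that sandbox.
        function Watch() {
          const { scene } = useGLTF("/watch.glb"); // hypothetical model file
          return <primitive object={scene} />;
        }

        export default function App() {
          return (
            <Canvas camera={{ position: [0, 0, 3] }}>
              <ambientLight intensity={0.5} />
              <directionalLight position={[5, 5, 5]} />
              <Watch />
              <OrbitControls />
            </Canvas>
          );
        }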

    • Cool example, but all r3f is doing here is providing the three.js camera, the controls, and the text with the emoji; the watch itself is loaded as a .glb file, which is the part I'd assume most people would be interested in learning about.

      1 reply →

  • Three.js, maybe, but it doesn't abstract too much away in my opinion. It has a lot of functionality around more complex topics (textures, for example), but since he doesn't seem to use those, it's probably not worth the hassle.

Are you sure about the "by hand" part? There's a lot of repetition; it feels like at least some of it must be generated.

  • I wish he'd write a post about how he developed these visualizations. How does one even learn how to make something this amazing?

    • > How does one even learn how to make something this amazing?

      I haven't done anything quite this amazing, but I have created other things with minimal upfront knowledge and "the way" is simple: just jump in and give it your best shot with what you already know, identify the most glaring deficiency in what you made, take your best shot at solving that, and repeat that process until you have something cool. You can also use this process to focus what you spend time studying/learning, as you backfill the information you were missing to figure out how to overcome whatever obstacles you encounter.

      It does take time, but you know what they say about long journeys and single steps. Sometimes there are no shortcuts and you just have to take a lot of steps.

Apart from going to each post and manually looking at the JS code, is it possible to get it all in one go? https://ciechanow.ski/js/ returns a 403 error.

  • wget will do what you want, with the right flags. Try `wget -r https://ciechanow.ski/mechanical-watch/ --include-directories=js/`; the resulting `ciechanow.ski/js/` dir should have it.

    Adjust the flags as necessary to crawl more of the site if needed (omitting `--include-directories` without an `-l {limit}` flag will eventually crawl the whole site, please be kinder to their bandwidth than that).

I like that. It's a lot of work, but a lot of people seem to prefer making libraries work together over just doing the work, and it's timeless since it doesn't depend on any future frameworks; any browser-incompatibility issues that come up in the future can be fixed relatively easily.

Would antifragile be an applicable word to use here?

  • I have a WebGL project that's been broken for a few years due to a deprecated browser API; it is not a relatively easy fix.

Frameworks change and get in the way whenever a new version is released, which is most annoying. The underlying WebGL changes much more slowly, and in a much more controlled way, with a focus on backward compatibility. So I do the same whenever I can and skip frameworks to get rid of a dependency. The boilerplate overhead can be encapsulated very well into homemade JS functions that only change when I change them. And JS + WebGL is not really low-level.
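
(For example, one homemade wrapper like the sketch below, with names of my own invention, replaces a screenful of gl.uniform* calls and never changes unless I change it:)

    // Homemade boilerplate wrapper: set several uniforms in one call.
    // Illustrative only; covers just the types I happen to use.
    function setUniforms(gl, program, uniforms) {
      for (const [name, value] of Object.entries(uniforms)) {
        const loc = gl.getUniformLocation(program, name);
        if (loc === null) continue; // optimized out or misspelled
        if (typeof value === "number") gl.uniform1f(loc, value);
        else if (value.length === 2) gl.uniform2fv(loc, value);
        else if (value.length === 3) gl.uniform3fv(loc, value);
        else if (value.length === 16) gl.uniformMatrix4fv(loc, false, value);
      }
    }

    // Per frame: setUniforms(gl, program, { time: t, resolution: [w, h] });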

" Turns out he codes everything by hand in WebGL"

You really have to admire people who do stuff like that (I can't imagine that I would ever have the patience to do that).

What I'm mildly curious about is why anyone would want to do it. Is there demand for such stuff? I can understand it if the exercise were for training people, but wouldn't most people who are interested in the internal workings of watches already be familiar with them?

I'd reckon most would be like me in that they'd pulled enough watches apart in their younger years to already know their ins and outs (I'd long lost track of the number of watches and clocks I'd either fixed or disassembled by the time I was a teenager).

  • There's little benefit to writing your own asm these days[1], yet we need people who know asm intimately to write compilers.

    It's the same here. Without people who deeply understand a tool's input and output, we won't ever write a better tool.

    [1] don't @ me, cryptographers and kernel programmers.

    • "There's little benefit to writing your own asm these days"

      Agreed, but programming in ASM makes one think differently. In my opinion, every programmer should do some elementary assembler as part of their training, say some basic projects based on 8086 stuff or even the simpler 8085/Z80 etc. It's been a long while since I've done any serious ASM work, but the techniques one learns and the ideas one develops frame important attitudes that can't easily be gathered from high-level stuff (one gains a better understanding of the underlying hardware and such).

      You're right, we still need people for this work, it'd be pretty hard to optimize a compiler without them methinks.

      FYI, there's stuff I omitted to mention in the above comment; I mention it below in my reply to bitcurious. It's interesting that 'fitness for purpose' arises given your ASM comment.

  • > I can understand it if the exercise was for training people but wouldn't most people who were interested in the internal workings of watches already be familiar with them?

    Most young people don’t even have access to a mechanical watch these days.

    • I fully accept your point that 'most young people don’t even have access to a mechanical watch these days' but I'd contend that that ought not be relevant (as I'll explain in a moment).

      First, I should say that I was perhaps a little harsh in my above comment and didn't convey what I actually wanted to say. Unfortunately, I was distracted by the somewhat irritating fact that I couldn't view any of those drawings on two different smartphones (I could read the article but only saw white space where the drawings were supposed to be). It was only later, when I logged onto my PC, that I could view them, and even then the whole page was sluggish and a pain to view. Perhaps those with faster equipment and/or a faster internet connection had better luck.

      Thus, the thinly veiled message behind my comment was a question about fitness for purpose, both of the subject matter and of its method of delivery. My cynical inference was that if people today don't already understand the basics of gearing, mechanical advantage, escapement mechanisms and so on, then something has gone wrong with the education system. The basics of how clocks work were well covered in physics (in mechanics) during my first year of high school, and that should not have changed: we still live and move around in a mechanical world, and thousands of industrial mechanical devices rely on these principles and on people having a good working knowledge of them.

      By third-year physics, everything relevant to the essential workings of a clock would also have been covered at a basic mathematical level: Hooke's law, gears and gear ratios, friction, and simple harmonic motion (see the sketch at the end of this comment). Even thermal expansion and contraction and the need to compensate for them were discussed (and in this context the physics teacher even brought up Harrison's famous chronometer and how his design included temperature compensation to ensure that changes in these parameters did not cause heavy time drift).

      In essence, by third year, everything of importance in that article had been covered in the school curriculum. If articles such as the one under discussion are now being used to compensate for a lack of training in high-school science, then we have a serious problem with the education system.

      Incidentally, even if one's not interested in clock mechanisms, it's worth having a look at this 19th-century publication on mechanical movements on the Internet Archive (there are more types around than most people have likely ever considered): https://archive.org/details/Mechanical_Movements_507
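
      (To make that concrete: the balance wheel is just the torsional version of that high-school SHM, T = 2π·sqrt(I/κ). The numbers below are round illustrative values, not measured ones:)

          // Period of a balance wheel as a torsional harmonic oscillator.
          // Illustrative round values, not taken from any real movement.
          const I = 2e-9;        // moment of inertia of the wheel, kg*m^2
          const kappa = 1.26e-6; // hairspring stiffness, N*m/rad (Hooke's law)
          const T = 2 * Math.PI * Math.sqrt(I / kappa);
          console.log(T.toFixed(2), "s"); // ~0.25 s, i.e. a 4 Hz movement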

      2 replies →

Absolutely the best use of web tech I have seen. The best way to put it is: magic.

Great great great work!!!