
Comment by thechao

3 years ago

Watching the AI community rediscover automatic differentiation 20+ years after the field was considered "mature" was equal parts frustrating and fascinating. The frustrating part was watching them rewrite the history of discovery without any sense of rigor ... and that was also what made it so fascinating!
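For readers who haven't met the "mature" technique in question: here is a minimal sketch of forward-mode automatic differentiation via dual numbers, which long predates its rediscovery in deep learning. The names (`Dual`, `deriv`) are illustrative, not from any particular library.

```python
class Dual:
    """Number of the form a + b*eps, where eps**2 == 0."""

    def __init__(self, value, grad=0.0):
        self.value = value  # f(x)
        self.grad = grad    # f'(x), carried alongside the value

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.grad + other.grad)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # The product rule falls out of (a + b*eps)(c + d*eps) with eps**2 == 0
        return Dual(self.value * other.value,
                    self.value * other.grad + self.grad * other.value)

    __rmul__ = __mul__


def deriv(f, x):
    """Evaluate f and f' at x in a single pass."""
    out = f(Dual(x, 1.0))
    return out.value, out.grad


# d/dx (x*x + 3*x) at x = 2: value 10.0, derivative 2x + 3 = 7.0
val, d = deriv(lambda x: x * x + 3 * x, 2.0)
```

Overloading arithmetic so derivatives propagate through ordinary code is exactly the idea that keeps getting reinvented under new names.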

This is indeed the frustration.

I'm waiting for some fresh group of grad students to make a breakthrough using a reinvented version of Pearl's "do"-calculus, or maybe some narrow breakthrough using Bayes nets, and then everyone geeks out on those for a while.

*I do think transformers (much like feed-forward networks + backprop from 2012-2018) are probably a lasting software architecture for inference applications, at least until we come up with new hardware and move beyond GPU-focused computing.

It's exciting to see it all working, but disheartening how ahistorical these last few years have been in AI, with the exception of Brooks, Sutton, and a few other greybeards in the field who say something similar.

  • Scarcity is not a myth.

• There is no scarcity of the fundamental human needs: water, food, shelter, and love.

      The only reason someone lacks them is because someone else is hoarding them.

      This is well established in global trade metrics.

The funny thing is that this happens constantly in every field; humans truly excel at repeating history without learning from the past.

Another example:

- HTML served by static file servers

- HTML generated by backend

- HTML enhanced with small JS snippets

- HTML generated by frontend, but served by backend

- Go back to step one, without learning why anyone moved on from the previous method

  • Poor training, poor communication, & knowledge not being curated.

When the best method of getting advice on the internet is to post the wrong answer, you know the system is broken.