His YouTube videos are gold. This one, in which he pushes the imprecision of floating-point numbers to extreme applications, such as training neural networks with linear activation functions or even implementing cryptographically secure functions, is superb.
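For anyone wondering how "linear" activations can learn anything nonlinear: IEEE 754 arithmetic is only approximately linear, and rounding supplies a tiny nonlinearity. A minimal illustration (not Tom7's actual construction, just the underlying quirk):

```python
# Float addition is not associative, so float ops are not exactly linear.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)
print(a == b)  # → False (a = 0.6000000000000001, b = 0.6)

# A nominally linear map f(x) = (x + c) - c is not the identity in floats:
c = 1e16  # large enough that adding 1.0 is lost to rounding
x = 1.0
print((x + c) - c)  # not equal to x, because x + c rounds to c
```

That rounding error is the "slack" the video exploits to squeeze nonlinear behavior out of nominally linear functions.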
This was harder to find than I would've thought, so for anyone else curious:
https://www.youtube.com/@tom7
https://www.youtube.com/watch?v=Ae9EKCyI1xU
I was.