No it isn't... the author is talking about products, and about building with a completely different mindset from deterministic software.
The bitter lesson is about model-level performance improvements and the futility of scaffolding in the face of search and scaling.
It isn't clear to me why these are so different. The alternative mindset to deterministic software is to use probabilistic models. Deterministic software bakes developer knowledge into explicit logic, and that becomes less effective in the big data era; that's the bitter lesson, and that's why the shift is taking place. I'm not saying the article is the bitter lesson restated, but it is a realisation of that lesson.
Ah ok yes that makes more sense to me. Thank you for clarifying - I agree this new philosophy of building on probabilistic software is an outcome of the bitter lesson.
And over time we will have even more capability in the model (which is more general purpose) than in our deterministic scaffolds...
How so?
The author advocates for building general-purpose systems that can accomplish goals within some causal boundary, given relevant constraints, versus highly deterministic logical flows created from priors like intuition or user research.
The parallel with the bitter lesson is that general-purpose algorithms that use search and learning, leveraged by increasing computational capacity, tend to beat out specialized methods that exploit intuition about how cognitive processes work.
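To make that contrast concrete, here is a minimal sketch in Python. It is illustrative only: the scenario, the function names, and the call_model helper are all hypothetical, not from the article. The deterministic version hard-codes the developer's priors as explicit branches; the probabilistic version states a goal and constraints and delegates the decision to a general-purpose model.

    def deterministic_flow(order_total: float) -> str:
        # Deterministic mindset: the developer's prior knowledge is
        # hard-coded as an explicit branch for every anticipated case.
        if order_total > 100:
            return "offer free shipping"
        elif order_total > 50:
            return "offer 10% coupon"
        return "no offer"

    def call_model(prompt: str) -> str:
        # Stub standing in for any LLM API, so the sketch runs as-is.
        return "offer 10% coupon"

    def probabilistic_flow(order_context: dict) -> str:
        # Probabilistic mindset: state the goal and the constraints
        # (the "causal boundary") and let a general-purpose model decide.
        prompt = (
            "Goal: maximise repeat purchases.\n"
            "Constraints: discounts must not exceed 15%; "
            "never promise delivery dates.\n"
            f"Context: {order_context}\n"
            "Decide what offer, if any, to make."
        )
        return call_model(prompt)

    print(deterministic_flow(72.0))
    print(probabilistic_flow({"order_total": 72.0, "repeat_customer": True}))

The deterministic branch list has to be rewritten whenever the priors change; the probabilistic version only restates the goal and lets the model absorb the new context.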