Comment by reaperman
1 year ago
> (i.e., it depends what order you go)
At the risk of utterly derailing this with irrelevant discussion: path-dependent systems are particularly tricky for some people, IMHO. I think in a more state-based way, and my first rigorous dive into path-dependent calculation was during my chemical engineering degree -- I learned to be extremely vigilant about memorizing which quantities were path-dependent and triple-checking whether that affected the situation I was calculating.
I do wish there were more rigorous exposure to them at lower levels of education and a younger age. While I'm perfectly capable of handling path-dependent systems with proper focus and effort, my brain doesn't feel "native" when deriving solutions in those spaces - it feels similar to being "fluent enough" in another language. I feel this way about a lot of things -- I really think I'd have been happier and more fulfilled if I'd been immersed in super rigorous first-principles education beginning around age 8-9. I didn't do well with things like "memorize this procedure for doing long division" and did much better with conceptual derivations of physics/math/science/historical arcs, etc.
The problem is that no one thinks of type conversions as taking any explicit paths! It's one thing to view the actual process of long division as path-dependent (something everyone who learns about Gröbner bases is familiar with, as at the right level of generality even the result is path-dependent); it's another thing to apply the same intuition to the way the inputs are parsed. (You said divide 3 by 2 with remainder? Sure, the quotient is 3/2 and the remainder is 0. Problem?)
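A minimal sketch of that parenthetical (Python just to make the two parses explicit; the thread isn't about any particular language):

```python
from fractions import Fraction

# Parse 3 and 2 as integers: Euclidean division with remainder.
print(divmod(3, 2))  # (1, 1) -- quotient 1, remainder 1

# Parse the same symbols as rationals: division is exact,
# so the "quotient" is 3/2 and the "remainder" is 0.
q = Fraction(3) / Fraction(2)
r = Fraction(3) - q * Fraction(2)
print(q, r)  # 3/2 0
```

Same question, same algorithm, different answer -- the only thing that changed is the (usually invisible) conversion applied to the inputs.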
> The problem is that no one thinks of type conversions as taking any explicit paths!
Indeed. This is super surprising to me and I’m adding the topic to my “study” list. I had no idea until today - I could easily imagine it’s possible if some of the type conversions are “lossy” (e.g., maps to lists), but I have a strong feeling that simplified lossy conversions are not what is being referenced.
Most of Buzzard's examples are indeed lossless (even bijective) conversions, but the notation mathematicians use (and programming languages often support) makes it look like they are completely absent, which comes back to bite you. In mathematics, it bites you when you want to actually compute your function and realize you don't remember how all those canonical arrows go. In programming, it bites you when the default conversions the compiler inserts aren't the ones you meant.
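A rough Python analogue of the programming half (hedged: Python has no compiler inserting conversions, but its silent int-to-float promotion in mixed arithmetic plays the same role):

```python
n = 2**53 + 1  # exactly representable as a Python int, but not as a float64

# Mixing n with a float silently promotes it to float first;
# that implicit conversion rounds n down to 2**53 before any arithmetic runs.
print(n + 1.0 - 1.0 == n)  # False
print(int(float(n)))       # 9007199254740992, i.e. n - 1: the round trip is lossy
```

The notation `n + 1.0` gives no hint that a conversion happened at all, which is exactly the "looks completely absent" problem.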
Isn’t that the intuition behind homotopy type theory — that equivalence is a path between types?
(And related systems, e.g., cubical type theory.)
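For reference, the statement being gestured at is (roughly) the univalence axiom: for types $A, B$ in a universe $\mathcal{U}$,

$$(A \simeq B) \simeq (A =_{\mathcal{U}} B)$$

that is, equivalences between types correspond to paths between them in the universe. Cubical type theory is one way of giving that axiom computational content.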