Comment by generationP

1 year ago

Most of Buzzard's examples are indeed lossless (even bijective) conversions, but the notation mathematicians use (and programming languages often support) makes it look like they are completely absent, which comes back to bite you. In mathematics, it bites you when you want to actually compute your function and realize you don't remember how all those canonical arrows go. In programming, it bites you when the default conversions the compiler inserts aren't the ones you meant.
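
A minimal C++ sketch of that last point (my own illustration, not from Buzzard's examples): the int-to-double conversion is lossless here, but the compiler inserts it later than you probably wanted, and the signed-to-unsigned conversion quietly changes what a comparison means.

    #include <iostream>

    int main() {
        // The int -> double conversion is inserted only AFTER the integer
        // division 1 / 3 has already produced 0, so ratio is 0.0, not 0.333...
        double ratio = 1 / 3;

        // The usual arithmetic conversions turn -1 into a huge unsigned value,
        // so this comparison is false even though -1 < 1 mathematically.
        unsigned one = 1;
        bool less = (-1 < one);

        std::cout << ratio << " " << std::boolalpha << less << "\n";  // prints: 0 false
        return 0;
    }

Both conversions are "canonical" in the sense that there is an obvious map from the source type to the target type; the trouble is only that the notation hides where the map gets applied.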