Comment by Mikhail_Edoshin
2 days ago
Not quite. It is in a way (very visible when we talk about invariants), but it is not, well, semantic. A good notation for those steps would allow me to make only the distinctions I care about and ignore the ones I do not. So, for example, if I wanted to describe the greatest common divisor algorithm, I would focus only on the arithmetic part and define the domain of the parameters as "positive integers" (or maybe "non-negative integers").
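For concreteness, here is a sketch of that arithmetic core, written as if the domain "positive integers" could simply be stipulated (my illustration, not the commenter's; Python is an arbitrary choice of host language):

    # Euclid's algorithm; domain stipulated: a and b are positive integers.
    # The recurrence gcd(a, b) = gcd(b, a mod b) terminates because the
    # second argument strictly decreases toward zero.
    def gcd(a, b):
        while b:
            a, b = b, a % b
        return a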
A real programming language has a different goal: the code must run and do something useful. So I am unlikely to be able to define the domain that way (I cannot make the important distinctions) and, at the same time, will likely have to attend to things I do not care about (I must make unimportant distinctions). All of this cripples and obscures the logic. This is what happens now, and as a result we have a different implementation of everything for every programming language.
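A minimal sketch of that point, again assuming Python (the specifics are mine, not the commenter's): the language has no "positive integer" type, so the implementation must police the domain itself, and the policing swamps the three-line logic:

    def gcd_checked(a, b):
        # Distinctions the notation would not force on us:
        if not isinstance(a, int) or not isinstance(b, int):
            raise TypeError("gcd is defined on integers")
        if a <= 0 or b <= 0:
            raise ValueError("gcd is defined on positive integers")
        # The actual algorithm:
        while b:
            a, b = b, a % b
        return a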