Comment by skydhash

5 days ago

> What is utterly lacking is a formal notation for these computation-reasoning steps that is not tied to a specific programming language

A formal notation for this is a programming language, which is why everyone uses pseudocode or something equivalent for first-pass thinking or for communication.

Not quite. A programming language is formal in a sense (very visibly so when we talk about invariants), but it is not semantic in the way I mean. A good notation for those steps would let me make only the distinctions I care about and ignore the ones I do not. For example, to describe the greatest common divisor algorithm, I would focus only on the arithmetic and define the domain of the parameters as "positive integers" (or maybe "non-negative integers").
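At that level of description, the whole algorithm is a single recurrence over the chosen domain (a sketch in standard mathematical notation, not taken from the thread):

$$
\gcd(a, b) =
\begin{cases}
a & \text{if } b = 0,\\
\gcd(b,\ a \bmod b) & \text{otherwise,}
\end{cases}
\qquad a, b \in \mathbb{Z}_{\ge 0}.
$$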

A real programming language has a different goal: the code must run and do something useful. So I am unlikely to be able to define the domain like that (I won't be able to make the distinctions that matter) and, at the same time, I will likely have to attend to things I do not care about (I will have to make distinctions that do not matter). All of this cripples and obscures the logic. This is what happens now, and as a result we have a different implementation of everything for every programming language.
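To make that concrete, here is Euclid's algorithm in Rust (an illustrative sketch of my own, not something from the thread). The mathematics says only "non-negative integers", but the code has to commit to a machine type, and that commitment is exactly the kind of unimportant distinction I mean:

```rust
/// Euclid's algorithm. Mathematically the domain is just "non-negative
/// integers"; here we must commit to `u64`, silently excluding larger
/// integers -- a distinction the algorithm itself does not care about.
/// (Switching to a bignum type would be a different, equally arbitrary
/// commitment.)
fn gcd(mut a: u64, mut b: u64) -> u64 {
    while b != 0 {
        let r = a % b;
        a = b;
        b = r;
    }
    a
}

fn main() {
    assert_eq!(gcd(48, 18), 6);
    assert_eq!(gcd(0, 5), 5); // gcd(0, b) = b, consistent with the recurrence
}
```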