
Comment by antonvs

3 months ago

> If we think of parsing and type-checking as a unit together

This certainly leads to a blurring of the distinction, but the blurring is a consequence of choosing that premise in the first place.

Parsing will give you an AST recording that a term has a type, say `Int -> Bool`, which might be represented as a 3-node tree with the arrow at the root and the input and output types as leaves. But going back to the conceptual definition of syntax, this tells you nothing about what that tree means.
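
To make that concrete, here's a rough Haskell sketch of what I mean; the names (`Ty`, `TArrow`, `intToBool`) are made up purely for illustration:

```haskell
-- All names here are hypothetical, just to show the shape of the thing.
data Ty = TInt | TBool | TArrow Ty Ty   -- an arrow node with two subtrees
  deriving (Eq, Show)

-- `Int -> Bool` as a 3-node tree: the arrow at the root, Int and Bool as leaves.
intToBool :: Ty
intToBool = TArrow TInt TBool
```

At this point it really is just a tree; nothing about it says what an arrow type is supposed to mean.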

To bring type checking into the picture, we need another stage between 2 and 3: the stage where the types in the AST are assigned meaning, i.e. where their semantics is handled.

You'll often see people say that types are syntactic, but that's a slightly different use of the word: it refers to the fact that types categorize syntactic terms. So types apply to syntax, but what happens during actual checking is still semantics: it involves applying logic, inference, and so on, which go well beyond syntactic analysis in the parsing sense.
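
As a toy illustration of the difference, here's a sketch of what a checker does with the already-parsed tree: it looks names up in a context, applies typing rules, and compares types for equality, none of which is parsing. Again, all the names (`Expr`, `Ctx`, `typeOf`) are hypothetical, not taken from any real implementation:

```haskell
import qualified Data.Map as Map

-- Same Ty shape as in the sketch above, repeated so this stands alone.
data Ty   = TInt | TBool | TArrow Ty Ty
  deriving (Eq, Show)

data Expr = Var String | IntLit Int | Lam String Ty Expr | App Expr Expr

type Ctx = Map.Map String Ty

-- The parser already produced Expr; this stage assigns types by applying
-- rules, consulting a context, and checking type equality.
typeOf :: Ctx -> Expr -> Maybe Ty
typeOf _   (IntLit _)  = Just TInt
typeOf ctx (Var x)     = Map.lookup x ctx
typeOf ctx (Lam x t b) = TArrow t <$> typeOf (Map.insert x t ctx) b
typeOf ctx (App f a)   =
  case (typeOf ctx f, typeOf ctx a) of
    (Just (TArrow dom cod), Just targ) | targ == dom -> Just cod
    _                                                -> Nothing
```

The function walks syntax, but the rules it applies are about meaning.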

Really, it boils down to what you choose as your definitions. If you define syntax as, essentially, the kind of thing that's modeled by an AST, then there's not much ambiguity, which I think makes it a good definition. Semantics is then the assignment of meaning to the nodes and relationships in such a tree.

Re Chomsky, I think the discovery that PL semantics can be implemented entirely as a series of syntactic transformations is quite relevant to what Chomsky was getting at. But that doesn't negate semantics; it just gives us a more precise model of it. In fact, I have a sneaking suspicion that this formally-supported view might help clarify Chomsky's work, but a bit more work would be needed to demonstrate that. :)
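
To give a flavor of what "semantics as a series of syntactic transformations" looks like, here's a toy small-step rewriter in Haskell, with names made up for illustration: each step rewrites the tree itself, and the "meaning" of a term is whatever is left once no rule applies.

```haskell
-- Hypothetical names, purely illustrative.
data Term = Num Int | Plus Term Term
  deriving Show

-- One rewrite step, if any rule applies: reduce at the root when both
-- arguments are numerals, otherwise descend into the left subtree, then the right.
step :: Term -> Maybe Term
step (Plus (Num a) (Num b)) = Just (Num (a + b))
step (Plus l r) = case step l of
  Just l' -> Just (Plus l' r)
  Nothing -> Plus l <$> step r
step (Num _) = Nothing

-- Repeatedly transform syntax until no rule applies.
normalize :: Term -> Term
normalize t = maybe t normalize (step t)

-- e.g. normalize (Plus (Num 1) (Plus (Num 2) (Num 3))) yields Num 6
```

Every operation here manipulates syntax, yet the overall behavior is exactly what we'd call the semantics of the little language.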