Comment by silentbicycle

15 years ago

No, it's unification and backtracking.

Unification is far more powerful than pattern matching. Among other things, it supports working with partial information: you can pass around a list of cons cells whose cdr is an unbound variable, then bind that variable to another cons cell with an unbound cdr to get O(1) appending. (These are called difference lists.) The same can be done with trees and other, more complex data structures. In effect, rather than fully immutable or fully mutable variables, you get variables that can be bound only once and then semantically "always were" that value (though the binding can be undone on backtracking).
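A minimal sketch of the difference-list trick in Prolog (the predicate name `app_dl` is my own; the technique itself is standard):

```prolog
% A difference list represents a list as List-Hole, where Hole is the
% still-unbound tail variable. "Appending" is then a single unification
% of the first list's hole with the second list: O(1), no traversal.
app_dl(Xs-Hole1, Hole1-Hole2, Xs-Hole2).

% Example query:
% ?- app_dl([1,2|A]-A, [3,4|B]-B, Result).
% Result = [1,2,3,4|B]-B.
```

Note that nothing was copied or mutated: binding `A` to `[3,4|B]` makes the first list's tail "always have been" the second list, and that binding is undone if the query backtracks.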

Backtracking means that matching against a variable isn't just pass/fail but potentially a generator (AKA an "iterator") over all matching values. Sometimes (OK, often) the ensuing combinatorial explosion keeps it from scaling efficiently to real problems, but constraint programming compensates, pruning off a LOT of the space of potential solutions before searching. And saying "here's my problem, throw everything at it and figure it out" in very few lines of code is still great for prototyping.
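To make the generator idea concrete, here's a hedged sketch: a plain predicate enumerates its solutions on backtracking, and a constraint version (using SWI-Prolog's `library(clpfd)`; other Prologs differ) narrows domains before the search ever enumerates anything:

```prolog
% Calling digit/1 with an unbound variable acts as a generator:
digit(1). digit(2). digit(3).

% ?- digit(X).
% X = 1 ;
% X = 2 ;
% X = 3.

% Constraint version, assuming library(clpfd) is available:
% ?- use_module(library(clpfd)).
% ?- X in 1..100, Y in 1..100, X*Y #= 36, X #=< Y, label([X,Y]).
% X = 1, Y = 36 ;
% X = 2, Y = 18 ;
% ... (only the 5 factor pairs, not 100*100 candidates)
```

The constraint solver prunes impossible values from the domains of `X` and `Y` before `label/2` starts enumerating, which is exactly the "compensating" step: search only runs over what survives propagation.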

Comparing Prolog to Erlang makes the difference clearest, to me. Erlang doesn't do backtracking (because it would throw soft-real-time guarantees out the window!) or full unification (because passing around and subsequently binding unbound variables would be a form of non-local state). Erlang is still a great language IMHO, but it's a very different kind of language, because those two features are what make Prolog Prolog.