I'm working on a higher-order theorem prover, for which unification seems to be the most difficult subproblem.
If Huet's algorithm is still considered state-of-the-art, does anyone have any links to explanations of it that are written to be understood by a programmer rather than a mathematician?
Or even any examples of where it works and the usual first-order algorithm doesn't?
State of the art: yes. So far as I know, all algorithms more or less take the same shape as Huet's (I follow the theory of logic programming, although my expertise is tangential), provided you need full higher-order unification. Some subproblems are decidable, such as higher-order matching (unification where one term is closed) and unification in Dale Miller's pattern calculus.
Note that Huet's algorithm is the best possible in the following sense: it is a semi-decision procedure, in that it will find unifiers if they exist, but it is not guaranteed to terminate if they do not. Since we know that higher-order unification (indeed, even second-order unification) is undecidable, you cannot do better than that.
Explanations: The first four chapters of Conal Elliott's PhD thesis, Extensions and Applications of Higher-Order Unification, should fit the bill. That part runs to almost 80 pages, with some dense type theory, but it's well motivated and is the most readable account I've seen.
Examples: Huet's algorithm will come up with the goods for this example, unifying X(o) with Y(succ(0)), which of necessity will perplex a first-order unification algorithm, since the heads of both terms are variables.
An example of higher-order unification (really second-order matching) is: f 3 == 3 + 3, where == is modulo alpha-, beta-, and eta-conversion (but not assigning any meaning to "+"). There are four solutions:
\ x -> x + x
\ x -> x + 3
\ x -> 3 + x
\ x -> 3 + 3
In contrast, first-order unification/matching would give no solution.
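To see why all four solutions work, here is a small sanity check (a toy encoding of my own, not part of any unification library): each candidate, applied to 3 and beta-reduced, is syntactically equal to 3 + 3.

```python
# Terms are nested tuples; ('plus', a, b) stands for a + b, with "+"
# left uninterpreted, and integers stand for literals.
target = ('plus', 3, 3)

# The four solutions above, as Python lambdas; applying one to 3
# performs the beta-reduction.
candidates = [
    lambda x: ('plus', x, x),   # \ x -> x + x
    lambda x: ('plus', x, 3),   # \ x -> x + 3
    lambda x: ('plus', 3, x),   # \ x -> 3 + x
    lambda x: ('plus', 3, 3),   # \ x -> 3 + 3
]

print(all(f(3) == target for f in candidates))  # True
```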
HOU is very handy when used with HOAS (higher-order abstract syntax) to encode languages with variable binding while avoiding the complexities of variable capture, etc.
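As a sketch of the HOAS idea (a toy encoding of my own, not any particular library's API): object-language binders are represented by host-language functions, so substitution is just function application and variable capture cannot happen.

```python
def lam(f):
    # Object-level lambda; the body is a host-language function.
    return ('lam', f)

def app(t, u):
    # Object-level application.
    return ('app', t, u)

def reduce(t):
    """Beta-reduce a term to weak head normal form."""
    if isinstance(t, tuple) and t[0] == 'app':
        f = reduce(t[1])
        if isinstance(f, tuple) and f[0] == 'lam':
            return reduce(f[1](t[2]))  # substitution is just a call
        return ('app', f, t[2])
    return t

# (\ k -> \ x -> k) applied to 'a' and then 'b' reduces to 'a',
# with no capture-avoidance bookkeeping anywhere in sight.
const = lam(lambda k: lam(lambda x: k))
print(reduce(app(app(const, 'a'), 'b')))  # 'a'
```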
My first exposure to the topic was the paper "Proving and Applying Program Transformations Expressed with Second Order Patterns" by Gerard Huet and Bernard Lang. As I recall, this paper was "written to be understood by a programmer". And once you understand second-order matching, HOU isn't much further. A key result of Huet's is that the flexible/flexible case (where variables head both terms, the only case that does not arise in matching) is always solvable.
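To illustrate the flexible/flexible case concretely (a toy example; the encoding is mine): the pair X(o) and Y(succ(0)) above has a variable at the head of each side, and one solution simply maps both X and Y to the same constant function.

```python
# Both heads are free function variables; map each to the same
# constant function and the two sides become identical.
c = 'c'                    # any fixed term works as the constant
X = lambda _arg: c         # X := \ _ -> c
Y = lambda _arg: c         # Y := \ _ -> c

# One of infinitely many unifiers for the flex-flex pair.
print(X('o') == Y(('succ', 0)))  # True
```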
I would add to the reading list a chapter in volume 2 of the Handbook of Automated Reasoning. This chapter is probably more accessible to the beginner, and it ends with the λΠ-calculus, where Conal Elliott's thesis starts.
A preprint can be found here:
Higher-Order Unification and Matching
Gilles Dowek, 2001
http://www.lsv.fr/~dowek/Publi/unification.ps
Conal Elliott's thesis is more formal and concentrates on one variant; it also introduces a λΠΣ-calculus at the end, which has sum types in addition to product types.
Bye
There is also Tobias Nipkow's 1993 paper Functional Unification of Higher-Order Patterns (only 11 pages, 4 of which are bibliography and code appendix). The abstract:
The complete development of a unification algorithm for so-called higher-order patterns, a subclass of $\lambda$-terms, is presented. The starting point is a formulation of unification by transformation, the result a directly executable functional program. In a final development step the result is adapted to $\lambda$-terms in de Bruijn's notation. The algorithms work for both simply typed and untyped terms.
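For flavour, the "higher-order patterns" of that abstract restrict where free variables may occur: a free variable may only be applied to a list of distinct bound variables. A rough check of that side condition (my own toy representation, not Nipkow's code):

```python
def is_pattern_app(args, bound_vars):
    """True iff a free variable's argument list satisfies the pattern
    restriction: every argument is a bound variable, all distinct."""
    return (all(a in bound_vars for a in args)
            and len(set(args)) == len(args))

bound = {'x', 'y'}
print(is_pattern_app(['x', 'y'], bound))  # True:  F x y is a pattern
print(is_pattern_app(['x', 'x'], bound))  # False: repeated argument
print(is_pattern_app(['c'], bound))       # False: 'c' is not bound here
```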