This marvellous expository text describes the authors' answer to
Exercise 1.2.6-63 in a book by Knuth [The art of computer
programming. Vol. 1, Addison-Wesley Publishing Co., Reading,
Mass.-London-Don Mills, Ont., 1969; MR 44 #3530]. Remarkably, in this
edition of Knuth's book the level of difficulty for this exercise,
i.e., to "develop computer programs for simplifying sums that involve
binomial coefficients", was rated as "50", exactly the same rating as
assigned to Fermat's last theorem [op. cit. ("Notes on the exercises",
Ex. 4)]. However, in contrast to the case of Fermat, a good portion of
these new and immensely useful summation methods---once found---are
elementary enough to be taught in courses like Stanford's "Concrete
Mathematics" [see R. L. Graham, D. E. Knuth and O. Patashnik, Concrete
mathematics, Second edition, Addison-Wesley, Reading, MA, 1994; MR
97d:68003].
The following paragraph, taken from Knuth's foreword, describes
informally what the book is about. "Science is what we understand well
enough to explain to a computer. Art is everything else we do. During
the past several years an important part of mathematics has been
transformed from an Art to a Science: No longer do we need to get a
brilliant insight in order to evaluate sums of binomial coefficients,
and many similar formulas that arise frequently in practice; we can
now follow a mechanical procedure and discover the answers quite
systematically." Maple and Mathematica programs that implement the
algorithms described in the book can be found on the World Wide Web or
on a separately available companion diskette.
Generally, the algorithmic problem of symbolic summation can be viewed
as the discrete analogue of the well-known problem of "symbolic
integration". As with integration, symbolic summation finds a wide
range of applications in pure mathematics, computer science and
physics. Many examples are carefully discussed in this book, ranging
from elementary combinatorial identities, such as the binomial
theorem, to number-theoretic results, such as the celebrated
Rogers-Ramanujan identities.
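To give a small taste of how mechanical such summation has become, here is a minimal sketch using the freely available SymPy system rather than the Maple or Mathematica programs mentioned above; the particular sum is an illustrative choice, not an example taken from the book.

```python
# Indefinite hypergeometric summation done fully mechanically:
# Gosper's algorithm finds sum_{k=0}^{n} k*k! = (n+1)! - 1.
from sympy import symbols, factorial
from sympy.concrete.gosper import gosper_sum

n, k = symbols('n k', integer=True, nonnegative=True)

s = gosper_sum(k * factorial(k), (k, 0, n))
print(s)  # a closed form equivalent to factorial(n + 1) - 1
```

No "brilliant insight" is required of the user: the algorithm either returns a closed form or reports that none of the required shape exists.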
The underlying mathematical theory of the methods presented is based
on contributions from several researchers. In 1945, Sister M. Celine
Fasenmyer ["Some generalized hypergeometric polynomials",
Ph.D. Thesis, Univ. Michigan, Ann Arbor, MI, 1946] showed how one can
find recurrence relations that are satisfied by sums of hypergeometric
terms, in a purely mechanical, i.e., algorithmic, way; this work was
also described by E. D. Rainville [Special functions, Macmillan, New
York, 1960; MR 21 #6447 (Chapters 14 and 18)]. Zeilberger recognized
that Sister Celine's technique could also serve as a basis for proving
combinatorial identities; in the case of univariate definite
hypergeometric summation, he observed that a drastic algorithmic
speed-up can be achieved by using Gosper's decision procedure
[R. W. Gosper, Jr., Proc. Nat. Acad. Sci. U.S.A. 75 (1978), no. 1,
40--42; MR 58 #5497], which was originally designed for indefinite
hypergeometric summation. Also based on Gosper's algorithm, Wilf and
Zeilberger developed a theory of "WZ-pairs" that allows extremely
elegant certification of the truth of a certain class of combinatorial
identities (the proof certificate contains just a single rational
function which can be used for independent verification involving
rational function arithmetic only); in addition, this method can be
used for automatic generation of new identities from old
ones. Petkovšek developed an algorithm for deciding whether a given
linear recurrence with polynomial coefficients has a simple solution
as a linear combination of hypergeometric terms; this, together with
the algorithms above, gives a decision procedure for the closed-form
evaluation of definite hypergeometric sums.
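To make the notion of a proof certificate concrete: for the identity $\sum_k \binom{n}{k} = 2^n$, one WZ certificate is the single rational function $R(n,k) = -k/(2(n-k+1))$. The sketch below (written for this review in Python, not code from the book) checks the resulting WZ equation pointwise using exact rational arithmetic only.

```python
# WZ certification of sum_k C(n,k) = 2^n, i.e. sum_k F(n,k) = 1 with
# F(n,k) = C(n,k)/2^n.  The certificate R(n,k) = -k/(2*(n-k+1))
# defines G = R*F, and the heart of the proof is the WZ equation
#     F(n+1,k) - F(n,k) = G(n,k+1) - G(n,k).
from fractions import Fraction
from math import comb

def F(n, k):
    return Fraction(comb(n, k), 2**n)

def G(n, k):
    return Fraction(-k, 2 * (n - k + 1)) * F(n, k)

for n in range(1, 9):
    for k in range(n):          # keep n - k >= 1 so G(n, k+1) is defined
        assert F(n + 1, k) - F(n, k) == G(n, k + 1) - G(n, k)

# Summing the WZ equation over k telescopes its right side to zero, so
# S(n) = sum_k F(n,k) is constant in n; S(0) = 1 yields the identity.
assert all(sum(F(n, k) for k in range(n + 1)) == 1 for n in range(9))
print("WZ certificate verified")
```

The full proof establishes the WZ equation as an identity of rational functions, but the check above already illustrates why a single rational function suffices as a certificate.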
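Petkovsek's algorithm, too, is available in open-source form: SymPy's rsolve routine includes an implementation of the Hyper algorithm for recurrences with polynomial coefficients. The toy recurrence below is an illustrative choice, not an example from the book.

```python
# Hypergeometric solutions of a linear recurrence with polynomial
# coefficients, in the spirit of Petkovsek's Hyper algorithm
# (SymPy's rsolve includes such an implementation).
from sympy import Function, rsolve, symbols

n = symbols('n', integer=True)
y = Function('y')

# y(n+1) = (n+1)*y(n) with y(0) = 1 has the closed form y(n) = n!.
sol = rsolve(y(n + 1) - (n + 1) * y(n), y(n), {y(0): 1})
print(sol)  # a closed form equivalent to factorial(n)
```

When no hypergeometric solution exists, the algorithm says so, which is what turns the combination of methods described above into a genuine decision procedure.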
All these building blocks are brilliantly discussed in the text, which
is written in a spirit that puts strong emphasis on tutorial rather
than research exposition---a wise choice in view of the
novelty and broad applicability of the material. However, the
reviewer feels the need to say a few words about what is not in the
text. A paper by Zeilberger [J. Comput. Appl. Math. 32 (1990), no. 3,
321--368; MR 92b:33014] can be seen as the impetus for most of these
recent developments. This approach, besides providing a general
mathematical framework for the proof machinery presented, leads to
mathematically and algorithmically challenging questions such as
elimination in noncommutative operator algebras. In the text, general
holonomic theory [see, e.g., P. Cartier, Asterisque No. 206 (1992),
Exp. No. 746, 3, 41--91; MR 94a:33019], including important instances
like $q$-hypergeometric WZ-theory and elimination problems, is
touched upon only briefly. From a computational point of view, a more
detailed discussion of performance questions would be desirable.
Summarizing, the authors have not only skillfully explained their
methods to a computer, they have also done a truly outstanding job of
explaining these recent developments to a broad audience ranging from
students to researchers. In particular, this book is a must for
everyone who has struggled with a binomial sum at least once.