Suppose you have a machine whose basic operations are integer addition, subtraction and multiplication, and greater-than-or-equal testing. Theorem: you can’t determine whether a number n is odd faster than some constant times log(n).
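The bound is in fact tight in this model. Here's a minimal Python sketch (my own illustration, not from the post; the function name is made up) of a parity test that uses only addition, subtraction, and ≥ comparisons, and takes O(log n) basic operations:

```python
def is_odd(n):
    """Decide parity of a non-negative integer using only +, -, and >=.

    Double up to n (~log2 n additions), then greedily subtract the
    powers of 2 that are >= 2 back off (~log2 n comparisons and
    subtractions).  What remains is n mod 2, so the whole thing is
    O(log n) basic operations -- matching the lower bound.
    """
    powers = []
    p = 1
    while p + p <= n:            # build 2, 4, 8, ... up to n
        p = p + p
        powers.append(p)
    for q in reversed(powers):   # subtract each power >= 2 at most once
        if n >= q:
            n = n - q
    return n >= 1                # the remainder is n mod 2

print(is_odd(41))  # True
```

Note that no division, bit operations, or modulus appear anywhere; every step is one of the machine's four primitives.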
Proof: suppose you can; then, in particular, for large enough k your program should be able to determine for any number up to (let’s say) 2^(10k) whether the number is odd or even, and do something different in each case, using at most k steps. I’ll show that it can’t.
Well, what can the program have done within k steps? It has done at most k comparisons, so there are at most 2^k code paths. Each comparison tests the sign of some polynomial in n; which polynomial depends on the results of previous comparisons. In any case, the path the program has taken up to this point can depend only on the values of at most 2^k polynomials in n. Oh, and they all have degree at most 2^k, since each multiplication can at most double the degree.
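To make the degree-doubling point concrete, here's a toy symbolic trace (my own sketch): run a straight-line program while representing each value as a polynomial in n, stored as a coefficient list with the constant term first. Addition and subtraction keep the maximum degree; multiplication adds the degrees, so k steps starting from the degree-1 polynomial n can reach degree at most 2^k.

```python
def poly_add(a, b):
    # Coefficient-wise sum; degree of the result is max(deg a, deg b).
    out = [0] * max(len(a), len(b))
    for i, c in enumerate(a):
        out[i] += c
    for i, c in enumerate(b):
        out[i] += c
    return out

def poly_mul(a, b):
    # Convolution of coefficients; degree of the result is deg a + deg b.
    out = [0] * (len(a) + len(b) - 1)
    for i, c in enumerate(a):
        for j, d in enumerate(b):
            out[i + j] += c * d
    return out

n = [0, 1]              # the input itself: the polynomial n, degree 1
t = poly_mul(n, n)      # n^2: one multiplication doubles the degree
t = poly_mul(t, t)      # n^4: and so on -- after k steps, degree <= 2^k
u = poly_add(t, [1])    # n^4 + 1: a test "u >= 0" asks for the sign of
                        # this polynomial at the input value
print(u)                # [1, 0, 0, 0, 1]
```

Each comparison the real program makes is exactly a sign test on one such coefficient list evaluated at n, which is what the counting argument exploits.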
Well, all these polynomials collectively have at most 2^(2k) roots: 2^k polynomials, each of degree at most 2^k. We can therefore find an integer m no bigger than 2^(2k+1) such that none of those roots lies between m and m+1 inclusive. (Each root excludes at most two choices of m.) Both m and m+1 are well below 2^(10k), so the program is supposed to get them right; but it must produce the same results for m and m+1, since all the comparisons it does give the same answer for both. Since m and m+1 have different parity, it can't be doing the right thing for both of them.
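The pigeonhole step can be played out on a concrete example (the three polynomials below are arbitrary illustrative choices of mine, with roots at 3, at 3 and 7, and at 4.5): scan for consecutive integers m, m+1 at which every polynomial has the same nonzero sign. Every sign test then answers identically on both inputs, so a program built from those tests cannot distinguish an even number from an odd one.

```python
def sign(x):
    # -1, 0, or +1: all a comparison against zero can ever observe.
    return (x > 0) - (x < 0)

# Hypothetical comparison polynomials a short program might test.
polys = [
    lambda n: n - 3,
    lambda n: n * n - 10 * n + 21,   # roots at 3 and 7
    lambda n: 2 * n - 9,             # root at 4.5
]

for m in range(1, 100):
    # No root between m and m+1 means every sign test agrees on both.
    if all(sign(p(m)) == sign(p(m + 1)) != 0 for p in polys):
        print(m, m + 1)  # 1 2 -- a pair the program must treat alike
        break
```

In the real argument the scan is replaced by counting: 2^(2k) roots can't block every pair among the first 2^(2k+1) or so integers.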
Knuth’s book contains many clever algorithms. From the cursory reading I’ve given it so far, its contents don’t seem to be “papers on the design of algorithms” any more than the works of Shakespeare constitute “papers on the writing of plays”. So the title’s a bit misleading, which is a pity (it would be very interesting indeed to read Knuth’s thoughts on how to design algorithms) but not a surprise. The book is of course very good anyway.