Algebraic models of real computation and their induced notions of time complexity neglect stability issues of numerical algorithms. Recursive Analysis, on the other hand, appropriately describes stable numerical computations but, being based on Turing machines, usually lacks significant lower complexity bounds.
We propose a synthesis of the two models, namely a restriction of algebraic algorithms to computable primitives. These are thus inherently stable and allow for nontrivial complexity considerations. In this model, one can prove on a sound mathematical foundation the empirically well-known observation that stability and speed may be contradictory goals in algorithm design.
More precisely, we show that solving the geometric point location problem among hyperplanes by means of a total computable decision tree (i.e., one behaving in a numerically stable way for all possible input points) has, in general, complexity exponentially larger than when the tree is permitted to be partial, that is, to diverge (behave unstably) on a 'small' set of arguments.
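To illustrate the phenomenon behind this result, the following sketch (an assumption of ours, not taken from the paper) shows why a decision tree over computable reals is naturally partial: a sign test realized by interval refinement converges quickly for points off a hyperplane but diverges for points lying exactly on it, so a tree that answers everywhere must avoid such exact tests at the cost of extra work. All names (`interval_sign`, `locate`) and the bounded-refinement cutoff are hypothetical simplifications.

```python
from fractions import Fraction

def interval_sign(enclose, max_prec=60):
    """Partial sign test. `enclose(k)` returns a rational interval
    (lo, hi) of width <= 2**-k around the true value. We answer as
    soon as the interval excludes 0; if the value is exactly 0, no
    refinement ever decides, and we model divergence by None."""
    for k in range(max_prec):
        lo, hi = enclose(k)
        if lo > 0:
            return +1
        if hi < 0:
            return -1
    return None  # partial: the test 'diverges' on the hyperplane itself

def locate(point, hyperplanes):
    """Locate `point` among hyperplanes given as (normal a, offset b);
    the cell is the sign pattern of <a, x> - b. Each node of this
    (trivial, linear) decision tree is a partial sign test: stable off
    the hyperplanes, divergent on them."""
    pattern = []
    for a, b in hyperplanes:
        val = sum(ai * xi for ai, xi in zip(a, point)) - b
        # simulate an enclosure of val shrinking to width 2**-k
        def enclose(k, v=val):
            eps = Fraction(1, 2 ** (k + 1))
            return (v - eps, v + eps)
        pattern.append(interval_sign(enclose))
    return tuple(pattern)
```

For a point off the hyperplane x = 0, e.g. `locate((Fraction(1), Fraction(1)), [((Fraction(1), Fraction(0)), Fraction(0))])`, the first refinement already decides the sign; for a point on the hyperplane, the test never decides, which is exactly the 'small' divergence set a partial tree is allowed.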
The trade-off between these two extremes is investigated quantitatively for the planar case. Proofs involve both topological and combinatorial arguments. (c) 2005 Elsevier B.V. All rights reserved.