This is an extremely good algorithm for estimating the square root of a number. Amazingly, it was known to the ancient Babylonians, even though it is a special case of what is today called Newton's method!
Substituting f(x) = x^2 - a into the Newton-Raphson formula, we get this iteration (for approximating sqrt(a)):
x_{n+1} = (x_n + a/x_n) / 2
which makes some sort of odd sense: in multiplicative terms, x_n is "as close" to sqrt(a) as a/x_n is, so why not take their (additive) average as the next guess?
The starting value x_0 can be anything you like: 1, a/4, or any better guess you might have for the root.
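The iteration above can be sketched in a few lines of Python. This is a minimal illustration, not a production routine; the function name, the default starting guess, and the relative-tolerance stopping test are all my choices, not part of the original method:

```python
def heron_sqrt(a, x0=1.0, tol=1e-12):
    """Estimate sqrt(a) by repeatedly averaging x with a/x."""
    if a <= 0:
        raise ValueError("a must be positive")
    x = x0
    # Stop once x*x agrees with a to within a relative tolerance.
    while abs(x * x - a) > tol * a:
        x = 0.5 * (x + a / x)
    return x
```

Because convergence is quadratic (the number of correct digits roughly doubles each step), the loop exits after only a handful of iterations even from a poor starting guess.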
In a practical floating-point implementation, a can always be reduced to [1,4) (the even part of the exponent factors out, and its square root is known exactly), and only finite precision is needed. Thus we may take x_1 = 1 and calculate in advance the exact number of iterations required, saving ourselves the overhead of a loop-exit test. The main remaining problem is ensuring that roundoff error doesn't cause slight misconvergence.
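A rough sketch of that scheme, assuming IEEE double precision: `math.frexp`/`math.ldexp` perform the range reduction, and the fixed count of 7 iterations is my own estimate (enough for the worst case, a mantissa near 4, starting from x = 1). The careful roundoff analysis the text warns about is not attempted here:

```python
import math

def fixed_sqrt(a, iterations=7):
    """sqrt(a) via range reduction to [1, 4) plus a fixed number of
    Babylonian steps from x = 1 (no loop-exit test needed)."""
    if a < 0:
        raise ValueError("a must be nonnegative")
    if a == 0.0:
        return 0.0
    m, e = math.frexp(a)          # a = m * 2**e, with m in [0.5, 1)
    if e % 2 == 0:
        m, e = 4.0 * m, e - 2     # m now in [2, 4), e still even
    else:
        m, e = 2.0 * m, e - 1     # m now in [1, 2), e made even
    x = 1.0
    for _ in range(iterations):   # quadratic convergence: ~7 steps
        x = 0.5 * (x + m / x)     # reach double precision from x = 1
    return math.ldexp(x, e // 2)  # reattach the exact factor 2**(e/2)
```

Since the exponent e is made even before the iteration, multiplying back by 2**(e/2) at the end is exact, so all the rounding error lives in the few iterations on m.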