An iteration method for converging on a root of a differentiable function. If the current point is some x_n, here's how you get to the next point x_{n+1}: take the point (x_n, f(x_n)) on the graph of the function, and extend the tangent to the graph at that point until it hits the X axis; that's your next point. So, if the graph of f(x) is a straight line, you'll reach the root in one iteration.

In symbols, you get this formula:

x_{n+1} = x_n - f(x_n)/f'(x_n)

Obviously, we don't want f'(x) to get anywhere near 0 in the region where we're iterating, or we'll divide by 0 (or by a very small number, which can be worse). But under "good" conditions (continuous f'(x), with a non-zero derivative at the root), the method converges when x_1 is "close enough" to the root, and in fact roughly doubles the number of accurate digits at every iteration! A similar situation exists for complex functions.
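The iteration above can be sketched in a few lines of Python; the function and argument names here are illustrative, and the guard against a vanishing derivative reflects the caveat just mentioned:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson iteration: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        dfx = fprime(x)
        if abs(dfx) < 1e-300:          # derivative (nearly) vanished: step would blow up
            raise ZeroDivisionError("derivative vanished near the iterate")
        x_new = x - f(x) / dfx
        if abs(x_new - x) < tol:       # consecutive iterates agree: accept
            return x_new
        x = x_new
    raise RuntimeError("did not converge")

# Example: root of f(x) = x^2 - 2, i.e. sqrt(2)
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

Starting from x_1 = 1, a handful of iterations already pins down sqrt(2) to machine precision, consistent with the digit-doubling behaviour.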

You can get similar iterations in the multivariate case; there the formula becomes the vector formula

x_{n+1} = x_n - (f'(x_n))^{-1} · f(x_n)

Here f(x_n) is a vector, and f'(x_n) is the Jacobian matrix; the condition in the scalar case that f'(x) be far from 0 is replaced by the condition that f'(x_n) be well conditioned, so that it can be inverted without losing accuracy.

However, analysing convergence in this case is very hard (but when it works, it works great!).
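Here is a minimal sketch of the multivariate iteration for a 2-variable system (the function names and the example system are my own for illustration). In practice one solves the linear system J·delta = F rather than forming the inverse; for 2x2 that's just Cramer's rule:

```python
def newton_2d(F, J, x0, tol=1e-12, max_iter=50):
    """Multivariate Newton step x_{n+1} = x_n - J(x_n)^{-1} F(x_n),
    done for a 2x2 system by solving J * delta = F directly."""
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = F(x, y)
        (a, b), (c, d) = J(x, y)
        det = a * d - b * c            # an ill-conditioned J shows up as a tiny det
        if abs(det) < 1e-300:
            raise ZeroDivisionError("Jacobian (nearly) singular")
        # Cramer's rule for delta = J^{-1} (f1, f2)
        dx = (d * f1 - b * f2) / det
        dy = (a * f2 - c * f1) / det
        x, y = x - dx, y - dy
        if abs(dx) + abs(dy) < tol:
            return x, y
    raise RuntimeError("did not converge")

# Example: intersect the circle x^2 + y^2 = 4 with the line y = x
F = lambda x, y: (x * x + y * y - 4, y - x)
J = lambda x, y: ((2 * x, 2 * y), (-1, 1))   # Jacobian of F
sol = newton_2d(F, J, (1.0, 2.0))
```

The example converges to (sqrt(2), sqrt(2)); with a starting point where the Jacobian is nearly singular, the same code illustrates the conditioning problem mentioned above.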

A nice example of non-convergence of the Newton-Raphson method is given by taking f(x) = cbrt(x) (cube root) and working out the iteration: since f'(x) = (1/3)·x^{-2/3}, the step f(x_n)/f'(x_n) equals 3·x_n, so the iteration comes out as x_{n+1} = -2·x_n, which diverges very rapidly.
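This divergence is easy to watch numerically. The sketch below (helper names are mine) takes a real cube root that handles negative arguments, and runs the Newton step from a point very close to the root at 0:

```python
import math

def cbrt(x):
    # real cube root, valid for negative x as well
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def cbrt_prime(x):
    # derivative of x^(1/3): (1/3) * x^(-2/3); blows up at the root x = 0
    return (1.0 / 3.0) * abs(x) ** (-2.0 / 3.0)

x = 0.001
iterates = [x]
for _ in range(5):
    x = x - cbrt(x) / cbrt_prime(x)   # algebraically this is exactly -2*x
    iterates.append(x)
# each step doubles the distance from the root: 0.001, -0.002, 0.004, -0.008, ...
```

Note that this function violates both "good" conditions above: f'(x) is unbounded near the root rather than continuous and non-zero.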

The Newton-Raphson method requires you to know the derivative of the function. Sometimes this is inconvenient, imprecise (derivatives increase error), impractical (due to looking for the zero of a function determined empirically rather than analytically) or impossible (if you're writing a general root finding function, and your chosen API doesn't let you ask also for the derivative of the function). The closest methods that don't use derivatives are the secant method, which replaces f'(x_n) by the slope of the secant through the last two iterates, and the related "regula falsi" (false position) method.
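A sketch of the secant variant (names illustrative): the only change from Newton-Raphson is that the derivative is approximated from the two most recent points, so the caller supplies two starting guesses and no derivative.

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Secant method: Newton-Raphson with f'(x_n) replaced by the
    slope (f(x1) - f(x0)) / (x1 - x0) through the last two iterates."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if f1 == f0:                   # flat secant: slope estimate is 0
            raise ZeroDivisionError("secant through the last two points is flat")
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    raise RuntimeError("did not converge")

# Same example as before, but no derivative needed
root = secant(lambda x: x * x - 2, 1.0, 2.0)
```

Convergence is a bit slower than Newton's quadratic rate (the order is the golden ratio, about 1.62), which is the price of not knowing the derivative.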

A small point about the rate of convergence and the error. Let's say a is your true root, you are at step n, your current guess is x_n, and after the iteration you are going to get x_{n+1}. Expanding f in a Taylor series about x_n, we can write

f(a) = f(x_n) + (a - x_n)·f'(x_n) + (a - x_n)^2·f''(x_n)/2 + ...

Since f(a) is zero, dividing throughout by f'(x_n) gives

-f(x_n)/f'(x_n) = (a - x_n) + (a - x_n)^2·f''(x_n)/(2·f'(x_n))

The term on the left is just x_{n+1} - x_n. So we get

x_{n+1} - x_n = (a - x_n) + (a - x_n)^2·f''(x_n)/(2·f'(x_n))    ...(1)

Okay, that's the main formula, and it gives us two interesting results. The first is obtained by adding x_n to both sides and subtracting a:

x_{n+1} - a = (a - x_n)^2·f''(x_n)/(2·f'(x_n))

If we write the error at the nth step as e_n = a - x_n, then we have

|e_{n+1}| ≤ |e_n|^2·M

where M is a bound for |f''/(2·f')| on the interval of interest. This says that the Newton-Raphson method is quadratically convergent.

The second interesting result is obtained by neglecting the quadratic term in formula (1) above. We get

x_{n+1} - x_n ≈ a - x_n = e_n

So an estimate for the error is just the difference between the points at consecutive iterations!
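Both results are easy to check numerically. The sketch below runs Newton's iteration for sqrt(2) (with f(x) = x^2 - 2, so the step is (x^2 - 2)/(2x)) and tabulates the error estimate x_{n+1} - x_n next to the true error a - x_n:

```python
# Compare the error estimate x_{n+1} - x_n with the true error e_n = a - x_n
a = 2 ** 0.5          # the known root, for checking
x = 1.5
pairs = []
for _ in range(4):
    x_next = x - (x * x - 2) / (2 * x)   # Newton step for f(x) = x^2 - 2
    pairs.append((x_next - x, a - x))    # (estimate, true error)
    x = x_next
# the two columns agree to leading order, and the errors
# shrink quadratically: ~1e-1, ~1e-3, ~1e-6, ~1e-12
```

The agreement between the columns tightens as e_n shrinks, since the neglected term in (1) is of order e_n^2.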
