A root-finding method similar to the Newton-Raphson method, which is better in some cases (especially when f'(x) doesn't have a "nice" form, and you don't get the sort of nice cancellations like in the Babylonian square root algorithm).

At each stage, the last two values x_{n-1}, x_{n} are kept; the straight line through the points (x_{n-1},f(x_{n-1})) and (x_{n},f(x_{n})) on the graph of f is drawn, and its intersection with the x-axis is taken as x_{n+1}.

Symbolically, it comes out like this:

x_{n+1} = x_{n} - (x_{n}-x_{n-1})*f(x_{n}) / (f(x_{n})-f(x_{n-1}))

(but it's much easier just to work it out for some specific f).
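For concreteness, here is a minimal sketch of the iteration in Python (the function name, tolerance, and stopping rule are my own choices, not part of the description above):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Root of f via the iteration above, started from two guesses x0, x1."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if f1 == f0:                  # horizontal secant line: give up
            break
        # x_{n+1} = x_n - (x_n - x_{n-1}) * f(x_n) / (f(x_n) - f(x_{n-1}))
        x2 = x1 - (x1 - x0) * f1 / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        # Only ONE new evaluation of f per iteration; f1 is carried over.
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)
    return x1

# Example: the positive root of x^2 - 2, i.e. an approximation of sqrt(2).
root = secant(lambda x: x * x - 2, 1.0, 2.0)
```

Note how the loop carries f(x_{n}) over to the next step, so each iteration costs just one fresh evaluation of f.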

As you can see, you don't need to work out f'. Nonetheless, your function should be "nice" for any of the iteration methods to work, so this doesn't really help in terms of approximating roots for "hard" functions. What *does* help is that each iteration requires evaluating only one function, f itself (Newton-Raphson needs both f and f'), so it can be a faster method.

In the "nice" cases, this secant method (the name "regula falsi" properly belongs to a close relative that keeps a sign-change bracket) grows the number of accurate digits in x_{n} like the Fibonacci numbers; equivalently, every iteration multiplies the number of accurate digits by roughly the Golden ratio (1.618...). Since less work is needed per iteration, this method is often superior to the Newton-Raphson method (which doubles the number of accurate digits, but must evaluate both f and f' at every step): two secant steps cost about as much as one Newton step, and 1.618^2 = 2.618... > 2. And, since symbolic knowledge of f' isn't necessary, it can be a better choice in software engineering terms when writing a root-finding subroutine.
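The digit-growth claim above is easy to check numerically. This sketch (starting guesses, the test function, and the iteration counts are arbitrary choices of mine) tracks how many accurate digits each method has after every step:

```python
import math

f = lambda x: x * x - 2           # root: sqrt(2)
fp = lambda x: 2 * x              # derivative, used only by Newton-Raphson

target = math.sqrt(2)
digits = lambda x: -math.log10(abs(x - target) + 1e-300)  # accurate digits

# Newton-Raphson: accurate digits roughly double per step.
x = 1.5
newton_digits = []
for _ in range(4):
    x = x - f(x) / fp(x)
    newton_digits.append(digits(x))

# Secant iteration: accurate digits grow roughly like Fibonacci numbers,
# i.e. each step multiplies the digit count by about 1.618.
a, b = 1.0, 1.5
secant_digits = []
for _ in range(6):
    a, b = b, b - (b - a) * f(b) / (f(b) - f(a))
    secant_digits.append(digits(b))
```

Printing the two lists shows the doubling versus golden-ratio growth patterns, at least until floating-point precision runs out around 16 digits.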

Which you pick is usually a matter of convenience with the precise form of the iteration formula; however, the Newton-Raphson iteration also turns up in many (seemingly unrelated) settings, so both should be known!