This is the ranking system used by the Google search engine. It was invented by
Sergey Brin and Larry Page at Stanford University in 1998. The system
models a random surfer on the Web: the surfer starts at a random
page, picks one of its URLs at random and follows that link, and that is the basic idea.

If you have some background in probability, you can think of this as
a Markov process in which the states
are pages, and the transitions are the links between pages, all equally
probable. As you may also suspect if "Markov process" rings a bell,
a page with no links to other pages becomes a sink, and that
makes the whole scheme unusable, because sink pages trap the
random surfer forever.

However, the solution is quite simple. If the random surfer arrives at
a sink page, it picks another URL at random and continues surfing.
To be fair to pages that are not sinks, these random jumps are
added to all nodes in the Web, with a residual probability usually set to
*q=0.15*.
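To make this concrete, here is a small sketch (the three-page graph and the variable names are my own invention) that builds the modified transition matrix for a toy web where one page is a sink. With the random jumps added, every row sums to 1, so the surfer can never get stuck:

```python
# Hypothetical 3-page web: page 0 links to 1 and 2, page 1 links
# to 2, and page 2 is a sink (no outgoing links).
N = 3
links = {0: [1, 2], 1: [2], 2: []}
q = 0.15  # residual (random-jump) probability, as in the text

M = [[0.0] * N for _ in range(N)]
for i in range(N):
    out = links[i]
    for j in range(N):
        if out:
            # follow one of i's links with probability (1-q),
            # split evenly among its outgoing links
            follow = (1 - q) * (1 / len(out) if j in out else 0.0)
        else:
            # sink page: jump to any page uniformly at random
            follow = (1 - q) / N
        M[i][j] = q / N + follow

# Every row is a probability distribution over next pages.
for row in M:
    assert abs(sum(row) - 1.0) < 1e-12
```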

So, the equation is as follows:

*
Pagerank(i) = q/N + (1-q) Sum(j={pages that link to i}; Pagerank(j)/outdegree(j))
*

where *N* is the total number of pages and *outdegree(j)* is the number
of links going out of page *j*: each page distributes its Pagerank evenly
among the pages it links to.
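The equation can be solved by simple fixed-point iteration. Here is a minimal sketch on a hypothetical 4-page web (the link structure and page names are invented for illustration), using *q=0.15* as above:

```python
# Invented toy link graph: page -> list of pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
q = 0.15
N = len(links)

pr = {p: 1.0 / N for p in links}  # start from a uniform distribution
for _ in range(50):
    new = {}
    for i in links:
        # sum over pages j that link to i, each contributing its
        # Pagerank divided by its number of outgoing links
        incoming = sum(pr[j] / len(out)
                       for j, out in links.items() if i in out)
        new[i] = q / N + (1 - q) * incoming
    pr = new

print(sorted(pr, key=pr.get, reverse=True))  # C, the most linked-to page, ranks highest
```

Note that the iteration preserves the total Pagerank (the values always sum to 1), so the result can be read as the stationary distribution of the random surfer.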

It's worth noticing, and that's why Pagerank is so appealing in terms
of elegance (which is, of course, the whole point of doing Mathematics),
that the Pagerank values form the principal eigenvector (the one with
eigenvalue 1) of the modified adjacency matrix.
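This claim is easy to check numerically. The sketch below (a hypothetical 3-page graph, chosen with no sinks to keep things short) builds the modified matrix and confirms that the normalized eigenvector for eigenvalue 1 is a fixed point of the random surfer's transition:

```python
import numpy as np

q = 0.15
# adjacency matrix of an invented 3-page web: A[i, j] = 1 if i links to j
A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
P = A / A.sum(axis=1, keepdims=True)  # row-stochastic transition matrix
G = q / 3 + (1 - q) * P               # modified ("Google") matrix

vals, vecs = np.linalg.eig(G.T)       # left eigenvectors of G
k = np.argmax(vals.real)              # principal eigenvalue, which is 1
pr = vecs[:, k].real
pr /= pr.sum()                        # normalize into a probability vector

assert np.isclose(vals[k].real, 1.0)
assert np.allclose(G.T @ pr, pr)      # pr is a fixed point of the surfer
```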

This algorithm really rocks, because it's fast to compute (only
a few iterations are needed) and in practice it gives good results. The
main disadvantage is that it favors older pages: a new page,
even a very good one, will not have many links pointing to it yet. That's
why it has to be combined with textual analysis or other ranking methods.