A Markov process {X1, X2, ...} on the vertices of a graph G=(V,E) which moves at each step from a vertex to one of its neighbours. The process is therefore completely determined by the transition probabilities

P{Xn+1=v | Xn=u} = p(u,v)
for all {u,v}∈E: if we're at the vertex u, we pick one of its neighbours v for our next step with probability p(u,v) (these probabilities summing to 1 over the neighbours of u), independently of all previous steps, and go there.
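
To make this concrete, here is a minimal Python sketch that simulates such a walk from a given transition table; the graph and the numbers p(u,v) in it are made up purely for illustration:

    import random

    # Transition probabilities p(u, v), stored as a nested dict: for each vertex u,
    # its neighbours v and the probability of stepping to each (summing to 1).
    # The graph and the numbers are made up purely for illustration.
    p = {
        "a": {"b": 0.5, "c": 0.5},
        "b": {"a": 0.25, "c": 0.75},
        "c": {"a": 0.5, "b": 0.5},
    }

    def walk(p, start, steps):
        """Return the trajectory X0, X1, ..., X_steps of the walk."""
        u = start
        trajectory = [u]
        for _ in range(steps):
            neighbours = list(p[u])
            weights = [p[u][v] for v in neighbours]
            # Step to neighbour v with probability p(u, v), independently of the past.
            u = random.choices(neighbours, weights=weights)[0]
            trajectory.append(u)
        return trajectory

    print(walk(p, "a", 10))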

If we are equally likely to go to any of the neighbours, so that p(u,v)=1/deg(u) for every neighbour v of u, the walk is the particularly important simple random walk (which see).
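
In that uniform special case the sketch above reduces to picking a neighbour uniformly at random; a short sketch, with an arbitrary example graph (a 4-cycle):

    import random

    # Adjacency lists of an arbitrary example graph (a 4-cycle).
    adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}

    def simple_random_walk(adj, start, steps):
        """Simple random walk: each neighbour of u is chosen with probability 1/deg(u)."""
        u = start
        path = [u]
        for _ in range(steps):
            u = random.choice(adj[u])  # uniform over the neighbours of u
            path.append(u)
        return path

    print(simple_random_walk(adj, 0, 10))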

Random walks have been studied very extensively. Here are some easy comments.

For a finite graph G, this is a typical finite Markov process. Assume p(u,v)>0 for all {u,v}∈E. Then the walk is aperiodic iff G is not a bipartite graph, and the walk is irreducible iff G is connected. So it's an ergodic process iff χ(G)≠2 (G is not bipartite) and G is connected.
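
In code, checking this ergodicity criterion amounts to a connectivity check plus a bipartiteness check (a 2-colouring); a sketch, assuming the finite graph is given as adjacency lists:

    from collections import deque

    def connected_and_bipartite(adj):
        """BFS 2-colouring of a finite graph given as adjacency lists;
        returns the pair (is connected, is bipartite)."""
        vertices = list(adj)
        colour = {vertices[0]: 0}
        queue = deque([vertices[0]])
        bipartite = True
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in colour:
                    colour[v] = 1 - colour[u]
                    queue.append(v)
                elif colour[v] == colour[u]:
                    bipartite = False  # an edge inside one colour class: odd cycle
        connected = (len(colour) == len(vertices))
        return connected, bipartite

    # A triangle is connected and not bipartite, so the walk on it is ergodic.
    adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
    connected, bipartite = connected_and_bipartite(adj)
    print("ergodic:", connected and not bipartite)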

For an infinite graph G, the process is perhaps less familiar. The notions of transience and recurrence become relevant, and are easy to understand; the long-term behaviour is generally well understood, although some surprises exist (and more are expected in the less well understood regions).
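
As a purely illustrative experiment (the integer line Z is an example chosen here, not discussed above), the sketch below runs the simple random walk on Z and estimates, for a few finite horizons, how often it returns to its starting vertex; the estimates climb towards 1 as the horizon grows, reflecting the fact that this particular walk is recurrent.

    import random

    def returns_to_start(horizon):
        """One simple random walk on the integers; True if it revisits 0 within `horizon` steps."""
        position = 0
        for _ in range(horizon):
            position += random.choice((-1, 1))
            if position == 0:
                return True
        return False

    # Estimate the probability of returning within a finite horizon; the estimates
    # climb towards 1 as the horizon grows, since this walk is recurrent.
    trials = 1000
    for horizon in (100, 1000, 10000):
        hits = sum(returns_to_start(horizon) for _ in range(trials))
        print(horizon, hits / trials)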