The name given to a large family of optimisation algorithms, developed by R. E. Bellman. When searching a very large solution space, it is sometimes possible to transform exponential running time into polynomial running time by caching results that are optimal for smaller instances of the problem.
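Fibonacci numbers are not mentioned in this entry, but they make a minimal sketch of the caching idea: the naive recursion re-solves the same smaller instances exponentially often, while caching each result makes the identical recursion take linear time.

```python
from functools import lru_cache

# Naive recursion: fib_slow(n) recomputes fib_slow(k) for small k
# exponentially many times.
def fib_slow(n):
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

# The same recursion with each smaller instance's result cached:
# every fib(k) is computed once, so the running time is linear in n.
@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```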

For instance, searching a graph with edges labelled by their lengths for a shortest path between two vertices can use this technique. Define the k'th distance matrix M_{k} to have in row i, column j the length of the shortest path from vertex i to vertex j taking not more than k steps (if it is not possible to get from i to j in k steps, the value is infinity).

Then M_{1} is simply the graph's adjacency matrix. Given M_{k} we can readily calculate M_{k+1}: for each vertex l adjacent to j, M_{k} gives the length L of the shortest path from i to l taking not more than k steps, so we can get from i to j in not more than k+1 steps with distance L+d(l,j), where d(l,j) is the length of the edge from l to j. Minimise this over all l's, and also compare with entry <i,j> of M_{k} itself (a shortest path need not use the extra step), to get entry <i,j> of M_{k+1}; do this for all i and j to get M_{k+1}. The distance from some vertex A to another vertex B is simply entry <A,B> of M_{n}, where n is the number of vertices in the graph (no shortest path needs more than n-1 edges).
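The recurrence above can be sketched as follows; this is an illustrative implementation, not part of the original entry. It assumes the graph is given as an n-by-n matrix of edge lengths, with math.inf marking the absence of an edge and 0 on the diagonal, and it iterates the M_{k} -> M_{k+1} step until M_{n} is reached.

```python
import math

def shortest_paths(adj):
    """adj[i][j]: length of the edge from i to j (math.inf if no edge,
    0 on the diagonal).  Returns M_n, where M_n[i][j] is the length of
    the shortest path from i to j taking not more than n steps."""
    n = len(adj)
    m = [row[:] for row in adj]      # M_1 is the adjacency matrix itself
    for _ in range(n - 1):           # compute M_2, M_3, ..., M_n in turn
        nxt = [[math.inf] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                # Either keep the best path of <= k steps (m[i][j]), or
                # extend a <= k-step path i -> l by the edge (l, j).
                best = m[i][j]
                for l in range(n):
                    best = min(best, m[i][l] + adj[l][j])
                nxt[i][j] = best
        m = nxt
    return m
```

For example, on a directed 3-cycle with unit-length edges 0->1, 1->2, and 2->0, the entry <0,2> of the result is 2 (the path 0->1->2). Each iteration costs O(n^2) work per entry pair in this naive form, giving O(n^4) overall; the point of the entry is the polynomial bound, not the constant.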

Computer science makes up new names for everything. A huge number of special cases of the fundamental dynamic programming algorithms have been "rediscovered" over the years, and almost all are now named after their rediscoverers. For instance, edit distance, the Needleman-Wunsch algorithm, the Smith-Waterman algorithm, and the Viterbi algorithm were all known to Bellman. He simply invented too much to actually get the *credit* he deserved.