Polynomial time refers to how long an algorithm may take to finish its work. The running time of a computer algorithm depends on how big the problem is, and the size of a problem can vary. For instance, if you're computing something using a list of numbers, the time to run the algorithm may increase dramatically when you add more numbers to the list, or it may hardly increase at all. Because problem sizes vary like this, it is useful to describe the running time of an algorithm using a variable that represents the size of the problem.

So, if there are n elements in our list of numbers, then an algorithm that runs in polynomial time could have its running time expressed as something like Cn, where C is the number of steps it takes to process each element. The running time is still within the bounds of polynomial time as long as the formula fits the form c_x n^x + c_{x-1} n^{x-1} + ... + c_0 n^0, where the c's and x are constants. This is more easily expressed as O(n^x). (See Big-oh notation)
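As a sketch of the idea, here are two hypothetical list algorithms in Python: one that does a constant amount of work per element (roughly Cn steps), and one that compares every pair of elements (roughly n^2 steps). Both running times fit the polynomial form above, so both run in polynomial time:

```python
def sum_list(numbers):
    """Sum a list of n numbers.

    The loop body runs once per element, so the running time is
    about C*n steps for some constant C -- that is, O(n).
    """
    total = 0
    for x in numbers:   # executed n times
        total += x      # constant work per element
    return total


def has_duplicate(numbers):
    """Check whether any value appears twice in the list.

    Every pair of elements is compared, which takes about
    n*(n-1)/2 comparisons -- a formula of the polynomial form
    c2*n^2 + c1*n + c0, so this is O(n^2), still polynomial time.
    """
    for i in range(len(numbers)):
        for j in range(i + 1, len(numbers)):
            if numbers[i] == numbers[j]:
                return True
    return False
```

Note that doubling the list roughly doubles the work for `sum_list`, but roughly quadruples it for `has_duplicate`; both are still considered efficient in the polynomial-time sense.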

An example of a non-polynomial time algorithm would have a running time like O(n!). (See factorial)
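To illustrate, here is a hypothetical brute-force search written in Python: it tries every possible ordering of a set of points, and since n items have n! orderings, its running time grows like O(n!) rather than any polynomial:

```python
from itertools import permutations


def shortest_visit_order(points):
    """Find the ordering of the points with the smallest total
    distance between neighbours by trying all of them.

    There are n! orderings of n points, so the loop hidden inside
    min() runs n! times -- the running time is O(n!), which grows
    faster than any polynomial in n.
    """
    def total_distance(order):
        return sum(abs(a - b) for a, b in zip(order, order[1:]))

    return min(permutations(points), key=total_distance)
```

Even for modest inputs this blows up quickly: 10 points already give 10! = 3,628,800 orderings to check, and each extra point multiplies that count again.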