Divide-and-conquer algorithms typically have a running-time recurrence relation of the form:

- T(1) = c
- T(n) = aT(n/b) + cn^{k}

Here a problem of size *n* is split into *a* subproblems, each of size *n/b*, and *cn*^{k} is the cost of combining the subproblem solutions.

The closed-form solution of this recurrence depends on how a compares to b^{k}:

- if a > b^{k} then T(n) = O(n^{log_b a})
- if a = b^{k} then T(n) = O(n^{k} log n)
- if a < b^{k} then T(n) = O(n^{k})
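The three-way case analysis above is mechanical, so it can be sketched as a small helper. This is a minimal illustration (the function name `master_theorem` and the string output format are my own choices, not from the notes):

```python
import math

def master_theorem(a, b, k):
    """Return the asymptotic bound for T(n) = a*T(n/b) + c*n^k
    according to the three cases of the theorem above."""
    if a > b ** k:
        # exponent is log_b(a)
        return f"O(n^{math.log(a, b):.3g})"
    elif a == b ** k:
        return f"O(n^{k} log n)"
    else:
        return f"O(n^{k})"
```

For instance, `master_theorem(2, 2, 1)` falls into the middle case (a = b^{k} = 2) and yields `O(n^1 log n)`, matching the merge sort example below.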

For example, merge sort is a classic divide-and-conquer algorithm: the list to be sorted is split into 2 equal-sized subproblems, and it takes O(n) time to merge the two sorted sublists back together.

Thus the recurrence relation will be:

T(n) = 2T(n/2) + n

Where a = 2, b = 2, k = 1, so a = b^{k}.

Applying the above theorem, merge sort has a time complexity of O(n log n).
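A straightforward merge sort makes the correspondence with the recurrence concrete: two recursive calls on halves (a = 2, b = 2) plus a linear-time merge (cn^{1}). A minimal sketch:

```python
def merge_sort(xs):
    """Sort xs: two recursive calls on halves, then a linear merge."""
    if len(xs) <= 1:               # base case: T(1) = c
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])    # T(n/2)
    right = merge_sort(xs[mid:])   # T(n/2)
    # Merge step: the c*n combine cost.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The two recursive calls and the single pass over both halves in the merge loop are exactly the 2T(n/2) and n terms of the recurrence.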

Reference: Lecture notes from my Theory of Algorithms class