In computer science, specifically complexity theory, the capital Omega (Ω) is used to denote a particular class of functions: those that are asymptotic lower bounds. Specifically:

```
f(n) = Ω(g(n)) iff there exist constants c > 0 and N > 0 such that
f(n) >= c*g(n)  for all n > N
```

That is, f(n) is "at least as big" as g(n), up to a constant factor, for sufficiently large values of n. (There are some slightly different definitions as well; some researchers require only that there be infinitely many values of n for which the relationship holds, rather than all n > N, and so forth. The various versions are not equivalent, but they're similar enough for most purposes.) Thus, Big-Omega is the counterpart of Big-O: f(n) = O(g(n)) states that f(n) is at most as big as g(n), up to a constant factor. Between the two is Big-Theta: f(n) = Θ(g(n)) means that the two functions bound each other for sufficiently large values of n and suitably chosen constants, i.e. f(n) is both O(g(n)) and Ω(g(n)).
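The definition can be spot-checked numerically. Here is a minimal sketch, assuming a hypothetical pair f(n) = 3n² + 5n and g(n) = n², with the witness constants c = 3 and N = 0 (a finite check like this illustrates the definition but is not a proof, since the real quantifier ranges over all n > N):

```python
def is_big_omega_witness(f, g, c, N, n_max=10_000):
    """Spot-check that f(n) >= c*g(n) for all N < n <= n_max.

    This only samples finitely many n; the actual definition of
    f(n) = Omega(g(n)) requires the inequality for ALL n > N.
    """
    return all(f(n) >= c * g(n) for n in range(N + 1, n_max + 1))

f = lambda n: 3 * n * n + 5 * n   # hypothetical example function
g = lambda n: n * n

print(is_big_omega_witness(f, g, c=3, N=0))   # 3n^2 + 5n >= 3*n^2 holds
print(is_big_omega_witness(g, f, c=2, N=1))   # but n^2 >= 2*(3n^2+5n) fails
```

Note that the burden is on the prover to exhibit *some* c and N; the check failing for one choice of constants says nothing by itself.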

There is also little-omega notation, which is analogous to little-oh notation: f(n) is ω(g(n)) iff the limit as n->infinity of g(n)/f(n) is zero.

Lower bounds are usually expressed in big-Omega notation; e.g., any comparison-based sorting algorithm provably requires Ω(n log n) comparisons in the worst case.
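The counting argument behind that bound: a comparison sort must distinguish all n! possible input orderings, so its decision tree needs depth at least log2(n!), and log2(n!) >= (n/2)·log2(n/2) because the largest n/2 factors of n! are each at least n/2 — which is Ω(n log n). A quick sketch checking that inequality numerically, with arbitrarily chosen sample sizes:

```python
import math

# Decision-tree argument: at least log2(n!) comparisons are needed in
# the worst case. The inequality log2(n!) >= (n/2) * log2(n/2) shows
# this is Omega(n log n), since the top n/2 factors of n! are >= n/2.
for n in (8, 64, 1024):
    lower_bound = math.log2(math.factorial(n))
    witness = (n / 2) * math.log2(n / 2)
    print(n, round(lower_bound, 1), round(witness, 1))
```

This is why, say, merge sort's O(n log n) worst case is asymptotically optimal among comparison sorts: the upper and lower bounds meet, giving Θ(n log n).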