Context: statistics, probability, dependence

To make things simpler, we shall start with two random variables. Later we can generalise this to n random variables.

Suppose you have two random variables, X and Y, with distribution functions F(x) and G(y) and joint distribution function H(x, y). By Sklar's Theorem, there exists a function C(a, b) (which is itself a joint distribution function for two random variables taking values in [0, 1]) such that

H(x, y) = C(F(x), G(y))

and the function C is called a copula. Its use is to couple two marginal distributions together to form a joint distribution.
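As a sketch of this coupling in code (the Clayton copula and exponential marginals below are purely illustrative choices, not anything prescribed above):

```python
import math

def clayton_copula(u, v, theta=2.0):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta),
    defined here for u, v in (0, 1] and theta > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def exp_cdf(x, rate=1.0):
    """Marginal CDF F(x) of an Exponential(rate) random variable."""
    return 1.0 - math.exp(-rate * x) if x > 0 else 0.0

def joint_cdf(x, y):
    """Sklar's Theorem in action: H(x, y) = C(F(x), G(y))."""
    return clayton_copula(exp_cdf(x), exp_cdf(y))
```

Note that C(u, 1) = u and C(1, v) = v, as any copula must satisfy: plugging one marginal in at its maximum recovers the other marginal.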

Of course, when you replace the random variables X and Y with a random vector X, you get an n-dimensional form of Sklar's Theorem.

There is a more formal definition of a copula, but the one described here should be sufficient for most practitioners.

Copulas are especially useful in studying dependence between random variables, something in which statisticians are always interested.
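To illustrate that last point, here is a small sketch comparing the independence copula C(u, v) = uv with a Clayton copula (a standard positively dependent family; again an illustrative choice) applied to the same marginals. The marginals are identical in both cases; only the copula, and hence the dependence structure, differs:

```python
import math

def exp_cdf(x, rate=1.0):
    """Marginal CDF of an Exponential(rate) random variable."""
    return 1.0 - math.exp(-rate * x) if x > 0 else 0.0

def indep(u, v):
    """Independence copula: C(u, v) = u * v."""
    return u * v

def clayton(u, v, theta=2.0):
    """Clayton copula; positively dependent for theta > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

# Same marginals, different joint behaviour:
u, v = exp_cdf(0.5), exp_cdf(0.5)
p_indep = indep(u, v)      # joint CDF if X and Y were independent
p_dep = clayton(u, v)      # joint CDF under positive dependence
```

With positive dependence, small values of X and Y tend to occur together, so the Clayton joint CDF exceeds the independence one at this point.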

Cop·u·la, n. [L., bond, band. See Couple.]

1. Logic & Gram.

The word which unites the subject and predicate.

2. Mus.

The stop which connects the manuals, or the manuals with the pedals; -- called also coupler.


© Webster 1913.
