The covariance matrix of an n-dimensional vector random variable X is the n×n matrix of the covariances between the elements of the vector. It generalizes the variance of a scalar random variable to multiple dimensions.

The covariance matrix is a measure of how spread out the probability distribution of X is in n-dimensional space. The 'larger' the elements of the covariance matrix, the more spread out X is.

The covariance matrix of X, often denoted Σ, is defined by the formula

Σ = E[(X - μ)(X - μ)^T],
where X and μ are column vectors and μ = E[X] is the mean of X.
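
To make the formula concrete, here is a minimal sketch in Python with NumPy (the choice of language and library is an assumption; the writeup itself names no tools). It estimates Σ from samples by applying the definition directly, then checks the result against NumPy's built-in np.cov:

    import numpy as np

    rng = np.random.default_rng(0)

    # Draw 10,000 samples of a 3-dimensional random vector X.
    # Rows are observations; columns are the elements of X.
    X = rng.multivariate_normal(
        mean=[0.0, 1.0, -2.0],
        cov=[[2.0, 0.5, 0.0],
             [0.5, 1.0, 0.3],
             [0.0, 0.3, 0.7]],
        size=10_000,
    )

    mu = X.mean(axis=0)                      # estimate of mu = E[X]
    centered = X - mu                        # X - mu for every sample
    Sigma = centered.T @ centered / len(X)   # E[(X - mu)(X - mu)^T]

    # np.cov defaults to the unbiased (n - 1) denominator;
    # bias=True makes it match the plain average used above.
    assert np.allclose(Sigma, np.cov(X, rowvar=False, bias=True))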

The element of the matrix at row i and column j is the covariance between the ith and jth elements of X. In particular, the ith diagonal element of Σ is the variance of the ith element of X.
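
Continuing the snippet above (X, Sigma, and rng carry over), these entries can be checked one by one:

    # The (i, j) entry is the covariance of the ith and jth elements of X...
    i, j = 0, 1
    cov_ij = np.mean((X[:, i] - X[:, i].mean()) * (X[:, j] - X[:, j].mean()))
    assert np.isclose(Sigma[i, j], cov_ij)

    # ...and the diagonal holds the variances of the individual elements.
    assert np.allclose(np.diag(Sigma), X.var(axis=0))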

The covariance matrix is always symmetric and positive semi-definite. Both facts follow directly from the definition: symmetry holds because the covariance of the ith and jth elements equals the covariance of the jth and ith elements, and positive semi-definiteness holds because for any fixed vector a, a^T Σ a = Var(a^T X) ≥ 0.
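
Both properties are easy to confirm numerically, again continuing from the snippet above:

    # Symmetry: Cov(X_i, X_j) = Cov(X_j, X_i).
    assert np.allclose(Sigma, Sigma.T)

    # Positive semi-definiteness: all eigenvalues are non-negative
    # (allowing a tiny tolerance for floating-point error).
    assert np.all(np.linalg.eigvalsh(Sigma) >= -1e-12)

    # The reason: a^T Sigma a is the variance of the projection a^T X,
    # and a variance can never be negative.
    a = rng.standard_normal(3)
    assert np.isclose(a @ Sigma @ a, (X @ a).var())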

The covariance matrix may be singular. This happens when X has no variation along some direction in n-dimensional space: for example, when one element of X is constant, or is an exact linear function of the other elements.

More generally, singularity of the covariance matrix means that the distribution of X is flat along one or more orthogonal directions in n-dimensional space. The distribution is not really n-dimensional: it lives in an affine subspace of the n-dimensional space, and the rank of Σ gives the dimensionality of that affine subspace.
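
Here is a self-contained sketch of the degenerate case (again in NumPy, with made-up data): the third element of X is an exact linear function of the first two, so the distribution is flat along one direction and Σ has rank 2 rather than 3. An explicit tolerance is passed to matrix_rank because floating-point noise keeps the computed matrix only approximately singular:

    import numpy as np

    rng = np.random.default_rng(1)

    # Two free dimensions; the third column is determined by the first two,
    # so the distribution lives in a 2-dimensional subspace of R^3.
    Y = rng.standard_normal((10_000, 2))
    X = np.column_stack([Y, 2.0 * Y[:, 0] - Y[:, 1]])

    Sigma = np.cov(X, rowvar=False, bias=True)

    print(np.linalg.det(Sigma))                    # ~0: Sigma is singular
    print(np.linalg.matrix_rank(Sigma, tol=1e-8))  # 2: dimensionality of the
                                                   # subspace X lives in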
