(linear algebra, analysis:)
Let M be a square matrix. If Mv = mv for some nonzero vector v and some scalar m (mathematicians call this "M acting on v by multiplication", but all it means is that we take the matrix product of M and v), then m is called an eigenvalue of M (from the German eigen, roughly "own" or "self").

This case is important because it means that the linear transformation M acts merely as a scaling, by the factor m, in the direction of v.

An eigenvalue for a matrix A is a scalar λ such that:

Ax = λx

As a simple example, consider the matrix A:

  [ 3  1 ]
  [ 0  2 ]

Manipulate the equation above: rewrite Ax = λx as (A - λI)x = 0. This has a nonzero solution x exactly when A - λI is singular, i.e. when:

det(A - λI) = 0

Take the determinant of A - λI:

  | 3-λ  1   |
  | 0    2-λ |

Giving:

det(A - λI) = (3-λ)(2-λ) = 0

Solving gives the eigenvalues 2 and 3.

Using the eigenvalues, now you can find the eigenvectors.

For an n by n matrix, there are always n eigenvalues over the complex numbers, counted with multiplicity, though they may not be distinct (and need not all be real). Additionally, the sum of all eigenvalues is equal to the trace of the matrix.
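
A quick numerical check of the example above, sketched with NumPy (assuming it is installed); np.linalg.eig computes the eigenvalues and eigenvectors in one call:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors
vals, vecs = np.linalg.eig(A)

print(np.sort(vals))                 # [2. 3.]

# each column v satisfies A v = lambda v
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)

# the sum of the eigenvalues equals the trace (3 + 2 = 5)
assert np.isclose(vals.sum(), np.trace(A))
```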

In a more general setting, let O be an operator, acting on a space T (usually a topological vector space, but anything with a scalar product will do). We say a scalar λ is an eigenvalue for O if there exists ψ ∈ T, such that Oψ = λψ. Here ψ is an eigen(vector/function/whatever you're calling elements of T) for O.

Where a determinant and trace are well-defined, it can be shown that det O is the product of the eigenvalues and tr O is the sum. Even more usefully, the characteristic polynomial, c_O(λ) = det(O - λI), where I is the identity operator, has roots exactly at the eigenvalues.
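
A quick sketch (the matrix is arbitrary, chosen only for illustration) checking both identities on a 3x3 matrix with NumPy:

```python
import numpy as np

B = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

vals = np.linalg.eigvals(B)

# product of eigenvalues = determinant
assert np.isclose(np.prod(vals), np.linalg.det(B))

# sum of eigenvalues = trace
assert np.isclose(np.sum(vals), np.trace(B))
```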

Over function spaces, eigenvalues become very important in quantum mechanics where they represent values of observables. For example, the (time-independent) Schrödinger equation can be represented as Hψ = Eψ where ψ is a wavefunction, and H is the Hamiltonian operator. Here, the eigenvalue, E, represents the energy of the system.
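
As a rough numerical illustration of this (not part of the writeup above): for a particle in a box on [0, 1], with units chosen so that H = -d²/dx², discretizing H by finite differences turns Hψ = Eψ into a matrix eigenvalue problem, and the lowest matrix eigenvalue approximates the exact ground-state energy π²:

```python
import numpy as np

n = 200                    # interior grid points
h = 1.0 / (n + 1)          # grid spacing on [0, 1]

# H = -d^2/dx^2 as the tridiagonal (2, -1) finite-difference matrix / h^2
H = (np.diag(np.full(n, 2.0))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

E = np.linalg.eigvalsh(H)  # eigvalsh: for symmetric matrices, sorted output

print(E[0])                # roughly 9.8694; exact ground state is pi**2 = 9.8696...
```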

Some more little titbits about the eigen-family:

The eigenline is a line through the origin on which the eigenvectors fall.
i.e. - For an eigenvector
[x]
[y]
The eigenline is the line ax+by=0, where a and b are constants, which every scalar multiple of the eigenvector satisfies.

e.g. - For the eigenvector
[ 1 ]
[ 2 ]
The eigenline is 2x-y=0
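
A tiny sanity check of this example (illustrative only): every scalar multiple of the eigenvector lies on that eigenline:

```python
# every multiple (t, 2t) of the eigenvector (1, 2) satisfies 2x - y = 0
for t in (-3.0, 0.5, 1.0, 7.0):
    x, y = t * 1.0, t * 2.0
    assert 2 * x - y == 0
print("all multiples lie on the eigenline 2x - y = 0")
```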

A handy way of finding the eigenvalues of any 2x2 matrix A:
[a b]
[c d]
Is to solve the characteristic equation:
k^2 - (a+d)k + ad - bc = 0 (1)
To find any eigenvalues k. If this equation has no real solutions, the matrix has no real eigenvalues.
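
The characteristic equation (1) can be solved directly with the quadratic formula; a minimal sketch (the function name is my own):

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Real eigenvalues of [[a, b], [c, d]], from k^2 - (a+d)k + (ad - bc) = 0."""
    disc = (a + d) ** 2 - 4 * (a * d - b * c)
    if disc < 0:
        return []                        # no real eigenvalues
    root = math.sqrt(disc)
    return [((a + d) - root) / 2, ((a + d) + root) / 2]

print(eigenvalues_2x2(3, 1, 0, 2))       # [2.0, 3.0], matching the example above
```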

For each eigenvalue k found from solving (1), substitute it into the eigenvalue equation Ax = kx, where x is the vector
[x]
[y]

Ax = [a b] [x] = [ax + by]
     [c d] [y]   [cx + dy]

kx = k [x] = [kx]
       [y]   [ky]

So:

[ax + by]   [kx]
[cx + dy] = [ky]

Solving the simultaneous equations
ax + by = kx
cx + dy = ky
For an eigenvalue k gives us an equation of the eigenline of the form
mx + ny = 0
And the eigenvector is any pair of values which satisfy the equation of the eigenline.
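
The procedure above can be sketched in code; this is a minimal version (function name my own) that reads a representative eigenvector off the eigenline (a - k)x + by = 0, falling back to the second equation when the first row vanishes:

```python
def eigenvector_2x2(a, b, c, d, k):
    """A representative eigenvector of [[a, b], [c, d]] for eigenvalue k."""
    if b != 0 or a != k:
        # (a - k)x + b*y = 0 is a genuine line; (b, k - a) lies on it
        return (b, k - a) if b != 0 else (0.0, 1.0)
    if c != 0 or d != k:
        # first equation was 0 = 0; use c*x + (d - k)*y = 0 instead
        return (d - k, -c)
    return (1.0, 0.0)        # A = k*I: every nonzero vector is an eigenvector

print(eigenvector_2x2(3, 1, 0, 2, 2))    # (1, -1)
print(eigenvector_2x2(3, 1, 0, 2, 3))    # (1, 0)
```

For the worked example A = [[3, 1], [0, 2]] with k = 2, the eigenline is x + y = 0, and (1, -1) indeed satisfies A(1, -1) = (2, -2) = 2(1, -1).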

As the number of real eigenvalues is given by the number of real roots of the quadratic (1), it can be found from the discriminant of a quadratic equation:
b^2 - 4ac
Reading the coefficients off (1) (reusing a, b, c for the quadratic's coefficients, not the matrix entries):
a = 1
b = -(a+d)   (the sign makes no difference to b^2)
c = ad - bc

So the discriminant comes out as:
(a+d)^2 - 4(ad-bc)

After a brief bit of juggling, we can arrive at the following conclusions:

If a=d, the discriminant reduces to 4bc, so its sign is the sign of bc:
bc < 0 There are no real eigenvalues.
bc = 0 There is 1 (repeated) eigenvalue.
bc > 0 There are 2 distinct eigenvalues.

If a ≠ d, the discriminant is (a-d)^2 + 4bc. This is positive whenever bc ≥ 0, giving 2 distinct eigenvalues; when bc < 0 there may be 0, 1 or 2 eigenvalues, depending on whether (a-d)^2 is less than, equal to, or greater than -4bc.
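
The case analysis above can be packaged into a small counting function (a sketch; the name is my own):

```python
def count_real_eigenvalues(a, b, c, d):
    """Number of distinct real eigenvalues of [[a, b], [c, d]]."""
    disc = (a - d) ** 2 + 4 * b * c   # expanded form of (a+d)^2 - 4(ad - bc)
    if disc < 0:
        return 0
    return 1 if disc == 0 else 2

# a = d: the sign of the discriminant is the sign of bc
print(count_real_eigenvalues(1, -2, 3, 1))   # bc < 0 -> 0
print(count_real_eigenvalues(1, 0, 3, 1))    # bc = 0 -> 1
print(count_real_eigenvalues(1, 2, 3, 1))    # bc > 0 -> 2

# a != d with bc < 0 can still give a repeated eigenvalue:
print(count_real_eigenvalues(3, 1, -1, 1))   # (3-1)^2 + 4(-1) = 0 -> 1
```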
