(linear algebra, analysis:)
Let M be a square matrix. If, for some nonzero vector v and some scalar m, Mv = mv (this is M "acting on" v by multiplication), then v is called an eigenvector (or "self vector") of M.

Note that all vectors in the direction of v (tv, for any nonzero scalar t) are also eigenvectors of M; but they're not really considered "different" for these purposes.

An eigenvector of a matrix A is a vector x such that:

Ax = λx

For some scalar λ, which is called the eigenvalue. That is, x does not change direction when multiplied by A. As an example case, consider the matrix:

  [ 2  1 ]
  [ 0  3 ]

The eigenvalues of this matrix are 2 and 3: they are the roots of det(A - λI) = (2 - λ)(3 - λ) = 0 (for a triangular matrix like this one, the eigenvalues are simply the diagonal entries). For the first eigenvector, corresponding to eigenvalue 2, rearrange the above equation to:

(A - λI)x = 0

Which is just finding the null space of A - λI. For λ = 2, that matrix is:

  [ 0  1 ]
  [ 0  1 ]

A basis for the null space of this matrix, and thus the eigenvector for eigenvalue 2, is:

  [ 1 ]
  [ 0 ]

Do the same for eigenvalue 3, this time taking the null space of A - 3I,

  [ -1  1 ]
  [  0  0 ]

to get:

  [ 1 ]
  [ 1 ]

These two eigenvectors are linearly independent, so they form a basis for R^2 (an eigenbasis). In general, the eigenvectors of an n x n matrix A will form such a basis whenever A has n linearly independent eigenvectors, which is guaranteed, for example, when all of its eigenvalues are distinct.
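If you want to check this by machine, here is a minimal sketch in Python using NumPy (not part of the original writeup; note that numpy.linalg.eig normalises its eigenvectors to unit length, so they come out as scaled versions of the vectors found above):

  import numpy as np

  A = np.array([[2.0, 1.0],
                [0.0, 3.0]])

  # eig returns the eigenvalues and a matrix whose columns are eigenvectors
  eigenvalues, eigenvectors = np.linalg.eig(A)
  print(eigenvalues)  # [2. 3.]

  # Verify A v = lambda v for each eigenpair
  for lam, v in zip(eigenvalues, eigenvectors.T):
      print(lam, v, np.allclose(A @ v, lam * v))

  # The two eigenvectors are linearly independent, hence a basis of R^2
  print(np.linalg.matrix_rank(eigenvectors) == 2)  # True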

For further information, see MIT's OpenCourseWare (http://ocw.mit.edu/18/18.06/f02/index.html), which contains a fantastic set of video lectures on the subject.

The equation Av = λv can be read in a rather more intuitive fashion than its bare definition suggests, which is good news for physicists. In words: take a vector and transform it; if the new, transformed vector is simply a multiple of the old one, then that vector is an eigenvector, and the multiplier is the corresponding eigenvalue. (N.B. "eigen" is German for "own", so an eigenvector is literally an "own vector" (correct me if I'm wrong), and we can now see why they are so called: the transformation maps the vector to a multiple of itself.)
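To make that concrete with the matrix from the example above: multiplying [1 1]^T by A gives [3 3]^T, which is just 3 times the original vector, so [1 1]^T is an eigenvector with eigenvalue 3; multiplying [0 1]^T by A gives [1 3]^T, which points in a new direction, so [0 1]^T is not an eigenvector.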

A nice way to visualise such a transformation is that of a rubber square with an arrow drawn on it (see http://www.physlink.com/Education/AskExperts/ae520.cfm). If you stretch the square along a particular axis, only arrows in certain directions will keep their direction; this is true no matter how hard you stretch the beast. We can then say that those directions (vectors) are eigenvectors for that transformation (stretching), and the eigenvalues (length of the new arrow compared to the old arrow) depend on how hard you stretch (which is inherent in the transformation matrix).
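Here is a small sketch of that picture in Python with NumPy (purely illustrative; the stretch factor of 3 along the x-axis is made up for this example). Arrows along the stretch axes keep their direction; an arrow at an angle does not:

  import numpy as np

  # Stretch the square by a factor of 3 along x, leave y alone.
  stretch = np.array([[3.0, 0.0],
                      [0.0, 1.0]])

  def keeps_direction(T, v):
      # v keeps its direction iff T v is parallel to v,
      # i.e. the 2x2 determinant of [v, T v] vanishes.
      w = T @ v
      return np.isclose(v[0] * w[1] - v[1] * w[0], 0.0)

  print(keeps_direction(stretch, np.array([1.0, 0.0])))  # True,  eigenvalue 3
  print(keeps_direction(stretch, np.array([0.0, 1.0])))  # True,  eigenvalue 1
  print(keeps_direction(stretch, np.array([1.0, 1.0])))  # False, direction changes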
