There is a matrix A; calculate A^{n}.

There are two ways to go about this problem: repeated matrix multiplication, or the method I will outline here. You will have to make a decision about which way is faster; this **should** be faster for small symmetric matrices.

This method **will** work for real symmetric matrices; complex and non-symmetric matrices may not work.

From Brontosaurus: *"It will work for Hermitian matrices, i.e. the conjugate of the transpose is equal to the matrix".*

Say we have a matrix A.

Calculate the eigenvalues and hence the eigenvectors of the matrix A.

Normalize (or normalise) the eigenvectors.

*Basically, make it so all the vectors' lengths are 1 by dividing by a scalar, where for a vector __x__, length = √(__x__.__x__).*

With the eigenvectors __e___{1}, __e___{2} . . . __e___{n}, and eigenvalues λ_{1}, λ_{2} . . . λ_{n}.

From the eigenvectors create a matrix E whose columns are the eigenvectors:

    (    .         .              .     )
E = ( __e___{1} __e___{2} ..  __e___{n} )
    (    .         .              .     )

Calculate another matrix, E^{ -1}. If the matrix is real and symmetric, E^{ -1} = E^{T}; otherwise you will have to work out E^{ -1} the long way.

Create another matrix D with the eigenvalues on the diagonal:

    ( λ_{1}  ..    ..   )
D = (  ..   λ_{2}  ..   )
    (  ..    ..   λ_{n} )

Now, A = E D E^{ -1} (I won't go into the proof).

Now, say we calculate A^{2}:

A^{2} = E D E^{ -1} E D E^{ -1} = E D^{2} E^{ -1}

since the E^{ -1} E in the middle cancels to the identity. Since D is such a simple matrix, all the hard work is multiplying D^{2} by E and then by E^{ -1}.

**N.B.** You can cancel E^{ -1} E = Identity in the middle, but you **cannot** re-order the factors to cancel anything else, since matrix multiplication is not commutative, i.e. AB ≠ BA (well, not always).
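A quick sketch of the whole method in code (plain Python, 2x2 matrices as nested lists; the helper names `matmul` and `matrix_power_via_diag` are my own, not from any library):

```python
# Compute A^n = E * D^n * E^{-1} for a diagonalized 2x2 matrix.
# Raising the diagonal matrix D to the n-th power just means raising
# each eigenvalue to the n-th power -- that is the whole trick.

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matrix_power_via_diag(E, eigenvalues, E_inv, n):
    """A^n = E D^n E^{-1}, with D^n built from eigenvalue powers."""
    D_n = [[eigenvalues[0] ** n, 0], [0, eigenvalues[1] ** n]]
    return matmul(matmul(E, D_n), E_inv)

# The worked example from later in this write-up: A = [[1, 2], [2, 1]],
# eigenvalues -1 and 3, normalized eigenvectors as the columns of E.
s = 2 ** -0.5  # 1/sqrt(2)
E = [[s, s], [-s, s]]
E_inv = [[s, -s], [s, s]]  # E is orthogonal, so E^{-1} = E^T

A2 = matrix_power_via_diag(E, [-1, 3], E_inv, 2)  # A squared
```

Only two full matrix multiplications are needed no matter how big n is, which is where the saving over repeated multiplication comes from.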

This may be better explained by an example, so here goes...

We have a nice symmetric matrix A (makes the maths easier):

    ( 1 2 )
A = ( 2 1 )

If A has eigenvectors, then there exists a vector __x__ and an eigenvalue λ such that A __x__ = λ __x__.

Re-arranging:

A __x__ - λ __x__ = __0__
( A - λ I ) __x__ = __0__ (*where I is the identity matrix*)

If the above is true for a non-zero __x__, then the determinant of ( A - λ I ) = 0.

| (1-λ)   2   |
|   2   (1-λ) | = 0

(1-λ)^{2} - (2 * 2) = 0
(1-λ) = ± 2

Giving eigenvalues,
λ_{1} = -1
λ_{2} = 3
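The same determinant condition can be solved in a few lines (plain Python; the helper `eigenvalues_2x2` is my own illustration, applying the quadratic formula to the characteristic polynomial):

```python
import math

# det(A - lambda*I) = 0 for A = [[a, b], [c, d]] expands to
# lambda^2 - (a + d)*lambda + (a*d - b*c) = 0.
def eigenvalues_2x2(a, b, c, d):
    trace = a + d
    det = a * d - b * c
    disc = math.sqrt(trace ** 2 - 4 * det)  # real for symmetric matrices
    return sorted([(trace - disc) / 2, (trace + disc) / 2])

lams = eigenvalues_2x2(1, 2, 2, 1)  # the matrix A from the example
```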
Now, to get the eigenvectors.

Substituting λ_{1} into ( A - λ I ) __x__ = __0__:

( 2 2 ) (__e___{11})   ( 0 )
( 2 2 ) (__e___{12}) = ( 0 )

Giving, __e___{1} = (  1 )
                    ( -1 )

Substituting λ_{2} into ( A - λ I ) __x__ = __0__:

( -2  2 ) (__e___{21})   ( 0 )
(  2 -2 ) (__e___{22}) = ( 0 )

Giving, __e___{2} = ( 1 )
                    ( 1 )
Phew. Now, I'll only do the short check on these eigenvectors.

__e___{1}.__e___{2} = |__e___{1}| |__e___{2}| Cos Θ
__e___{1}.__e___{2} = (1*1) + (-1*1) = 0, so Cos Θ = 0
Θ = 90^{o} as expected.
Now, to normalize (or normalise).

For __e___{1}, length = √2,
so now, __e___{1} = (  1 ) / √2
                    ( -1 )

For __e___{2}, length = √2,
so now, __e___{2} = ( 1 ) / √2
                    ( 1 )
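The normalization and the angle check can be sketched together (plain Python; `normalize` and `dot` are my own helper names):

```python
import math

def normalize(v):
    """Divide a vector by its length, sqrt(v.v), so the result has length 1."""
    length = math.sqrt(sum(x * x for x in v))
    return [x / length for x in v]

def dot(u, v):
    """Scalar (dot) product of two vectors."""
    return sum(a * b for a, b in zip(u, v))

e1 = normalize([1, -1])
e2 = normalize([1, 1])
# e1 and e2 are orthonormal: each has length 1, and their dot
# product is 0 (the cosine of 90 degrees).
```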
Now, to construct our matrices E, E^{ -1} and D:

E = (  1/√2  1/√2 )
    ( -1/√2  1/√2 )

E^{ -1} = E^{T} = ( 1/√2  -1/√2 )
                  ( 1/√2   1/√2 )

D = ( -1  0 )
    (  0  3 )
Phew, all done. Now just a check.

A = E D E^{ -1}

A = (  1/√2  1/√2 ) ( -1  0 ) E^{ -1}
    ( -1/√2  1/√2 ) (  0  3 )

A = ( -1/√2  3/√2 ) ( 1/√2  -1/√2 )
    (  1/√2  3/√2 ) ( 1/√2   1/√2 )

A = ( (-1/2 + 3/2)  ( 1/2 + 3/2) )   ( 1 2 )
    ( ( 1/2 + 3/2)  (-1/2 + 3/2) ) = ( 2 1 )   As expected.
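The same check can be run numerically (plain Python, nested lists for the 2x2 matrices; `matmul` is my own helper name):

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s = 2 ** -0.5  # 1/sqrt(2)
E = [[s, s], [-s, s]]      # columns are the normalized eigenvectors
E_inv = [[s, -s], [s, s]]  # equals E transposed, since E is orthogonal
D = [[-1, 0], [0, 3]]      # eigenvalues on the diagonal

A = matmul(matmul(E, D), E_inv)
# Recovers [[1, 2], [2, 1]] up to floating-point rounding.
```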
Hurrah! Now, use this method for big values of n!

Hope that you find this method useful, and sorry about the notation; HTML doesn't make it easy for me.

I did this (correctly) on paper beforehand, so please inform me of any errors I made typing this up.

Thanks to Brontosaurus who pointed this out: *"2 vectors are orthonormal if their scalar product is 0 and they both have norm 1"*, this:*"if there is not an orthonormal basis of eigen vectors then you will have to invert E"*, and this: *"Your method is sure to work for any real symmetric (or complex hermitian) matrix, in other cases you may have to invert E"*.