A matrix may have any number of dimensions, and matrices with more than two are common. Applications from processor design to image processing to plasma dynamics rely on 4-, 6-, or 9-dimensional matrices.

Admittedly, in college, most students will only encounter two-dimensional matrices, in linear algebra classes or in books about world-camera-eye-screen coordinate transforms for 3D rendering. It should be noted that the techniques used in linear algebra (the conjugate gradient method, LU and Cholesky decomposition) can be useful with matrices of dimension *N* (where *N* is any number from 2 up to... for some reason, alt+236 doesn't work).
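To give a flavor of one of those techniques, here is a minimal sketch of Cholesky decomposition in plain Python (the example matrix and helper name are mine, not from any particular library):

```python
def cholesky(A):
    """Decompose a symmetric positive-definite matrix A into L * L^T,
    where L is lower triangular. A is a list of lists."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            # Subtract the contributions of the columns already computed.
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = (A[i][i] - s) ** 0.5
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

# A small symmetric positive-definite example.
A = [[4.0, 2.0],
     [2.0, 3.0]]
L = cholesky(A)
```

Once you have L, solving A x = b reduces to two cheap triangular solves, which is the whole appeal of the method.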

Still, I think that multi-dimensional matrices deserve mention, as I wasted an entire year staring at 4d Toeplitz block Toeplitz matrices, costing me an unrecoverably large portion of my sanity.

To answer -brazil-'s comment on my wu (see below): a second order tensor is often written as a 3x3 matrix, since a tensor of order *n* requires 3^{n} numbers to be represented, but a tensor is much more than just 3^{n} numbers. The key element of a tensor is the transformation law its components obey under a change of coordinate system. Softlinking "tensor" wouldn't really help much, as -brazil- has made the same mistake in his tensor wu.
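The transformation law can be sketched concretely. Under a rotation R of the coordinate axes, the components of a second order tensor transform as T' = R T R^T, and invariants such as the trace come out the same in both coordinate systems; a bare 3x3 table of numbers carries no such guarantee. A minimal illustration (the tensor values here are arbitrary, chosen just for the demo):

```python
import math

def matmul(A, B):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def rotation_z(theta):
    """Rotation of the coordinate axes about the z axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

# A second order tensor: 3^2 = 9 components, written as a 3x3 array.
T = [[2.0, 1.0, 0.0],
     [1.0, 3.0, 0.5],
     [0.0, 0.5, 1.0]]

# The transformation law: T' = R T R^T in the rotated coordinates.
R = rotation_z(math.pi / 6)
T_prime = matmul(matmul(R, T), transpose(R))

# The trace is unchanged by the change of coordinates -- that kind of
# invariance is what makes T a tensor rather than just 9 numbers.
trace = sum(T[i][i] for i in range(3))
trace_prime = sum(T_prime[i][i] for i in range(3))
```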

Of course, if it really is true that matrices are much more limited than previously thought, then we should notify mathematicians and physicists the world over, as their work has been rendered worthless. grin