By their transformations, thou shalt know them.
--Unknown*

If it looks like a duck, walks like a duck, and quacks like a duck, it's a duck.
--Ancient duck proverb

What is a tensor? Some would say that a tensor is merely an array or set of numbers, represented using indexed notation, like T = {T_ijkl}. This definition is not only unenlightening, it is simply wrong. I will attempt to give a better (if not comprehensive) description.

At the risk of sounding completely circular, a tensor is a mathematical object which transforms "like a tensor" under rotations and parity** (reflections through the origin). In an attempt to decrypt what I've just said, I offer a few examples, classified by rank.

A rank-1 tensor is a vector. That is, it is an object which transforms like a vector under rotations and parity. By this I mean that associated with any rotation or parity transformation of a coordinate system there is a particular orthogonal matrix R, and all vectors should transform as v → Rv under this transformation. For example, under parity transformations, v → -v.
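To make this concrete, here's a small numpy sketch (my addition, not part of the original argument) showing a vector transforming under a rotation and under parity:

```python
import numpy as np

# A rotation about the z-axis, represented by an orthogonal matrix R.
theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])
# Parity: reflection through the origin.
P = -np.eye(3)

v = np.array([1.0, 2.0, 3.0])

v_rot = R @ v   # v -> Rv under a rotation
v_par = P @ v   # v -> -v under parity

print(np.allclose(v_par, -v))               # a true vector flips sign under parity
print(np.allclose(R.T @ R, np.eye(3)))      # R is orthogonal: R^T R = I
```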

Why is this different from saying a vector is just a column of numbers? Because most columns of numbers don't have this property! A good counterexample is angular momentum. A simple definition of angular momentum would be L = r × p (where "×" refers to the cross product). It is the cross product of two vectors, which would seem to be a good candidate for a vector. How does it transform under parity?

Since r → -r and p → -p, L → (-r) × (-p) = r × p = L.

In other words, L does not change under parity! However, it does transform properly under rotations, and for this reason, it is called a pseudovector. Another example of a pseudovector is a magnetic field.
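You can check the pseudovector behavior of L numerically. A quick sketch (mine, with arbitrary example values for r and p):

```python
import numpy as np

r = np.array([1.0, 0.0, 0.0])   # position
p = np.array([0.0, 2.0, 0.0])   # momentum
L = np.cross(r, p)              # L = r x p

# Under parity both r and p flip sign, but their cross product does not:
L_par = np.cross(-r, -p)
print(np.allclose(L_par, L))    # L is unchanged: a pseudovector

# Under a (proper) rotation, L transforms like an ordinary vector:
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])
print(np.allclose(np.cross(R @ r, R @ p), R @ L))
```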

Moving backwards down the rank ladder, a rank-zero tensor is known as a scalar. One may easily think of scalars simply as single numerical values (as opposed to a set or array of numbers), but once again, a scalar is not just any number. It must "transform" like a scalar, which really means it must not change at all under rotations and parity. Scalars can be created by taking the dot product of vectors. Since both transform like vectors,

v^T v → v^T R^T R v = v^T R^(-1) R v = v^T v,

so the dot product is invariant. The reason "scalars" are often thought of as "just numbers" is because normally we don't think of "numbers" as things that transform under rotations. Allow me to give a counterexample:
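Here's the invariance of the dot product checked numerically (my sketch, using a random test vector):

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.normal(size=3)

theta = 1.2
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])

# v^T v is unchanged by any orthogonal transformation, since R^T R = I:
print(np.isclose((R @ v) @ (R @ v), v @ v))
# ...and under parity as well, since (-v).(-v) = v.v:
print(np.isclose((-v) @ (-v), v @ v))
```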

When you take the dot product of a vector with a pseudovector, you get an object which transforms like a scalar under rotations, but picks up a minus sign under parity. Any number which transforms this way is called a pseudoscalar. For example, the dot product of a particle's momentum with its spin is a pseudoscalar, known as the particle's helicity.
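A minimal sketch of the pseudoscalar behavior (my addition; the particular values of p and s are made up for illustration):

```python
import numpy as np

p = np.array([0.0, 0.0, 3.0])   # momentum: a true vector, flips under parity
s = np.array([0.0, 0.0, 0.5])   # spin: a pseudovector, unchanged under parity

helicity = p @ s                # dot product of a vector with a pseudovector

# Under parity p -> -p while s -> s, so the product picks up a minus sign:
helicity_par = (-p) @ s
print(helicity_par == -helicity)   # a pseudoscalar
```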

To generalize to higher-rank tensors, we need only put together combinations of lower-rank tensors. Roughly, a rank-n tensor is an object which transforms like n vectors. Thus, a rank-2 tensor transforms like two vectors, and so on. So how do two vectors transform? To transform a pair of vectors, each vector must be transformed, requiring two copies of the same orthogonal matrix (M → RMR^T in the case where M is a matrix, one possible type of rank-2 tensor).
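A sketch of this (my addition): build a rank-2 tensor as an outer product of two vectors, M_ij = a_i b_j, and check that transforming each vector separately agrees with transforming the matrix as M → RMR^T:

```python
import numpy as np

theta = 0.4
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
M = np.outer(a, b)              # M_ij = a_i b_j, a rank-2 tensor

# Transforming the two vectors and re-forming the outer product...
M_via_vectors = np.outer(R @ a, R @ b)
# ...is the same as transforming the matrix with two copies of R:
M_via_matrix = R @ M @ R.T
print(np.allclose(M_via_vectors, M_via_matrix))
```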

By analogy, there are also pseudotensors of every rank, and you can probably guess how they transform (like a tensor, but with an additional minus sign under parity). The most common example of a pseudotensor is the totally antisymmetric epsilon "tensor" (ε_ijk being the rank-3 example).
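Here's a sketch (mine) verifying the pseudotensor behavior of ε_ijk: applying R to all three indices leaves it invariant under rotations, but parity flips its sign, since each index contributes a factor of -1:

```python
import numpy as np

# Totally antisymmetric epsilon "tensor" in three dimensions.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0    # even permutations of (0,1,2)
    eps[i, k, j] = -1.0   # odd permutations

def transform(T, R):
    # Apply R to every index of a rank-3 tensor.
    return np.einsum('ia,jb,kc,abc->ijk', R, R, R, T)

theta = 0.9
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])
P = -np.eye(3)   # parity

print(np.allclose(transform(eps, R), eps))    # invariant under rotations
print(np.allclose(transform(eps, P), -eps))   # extra minus sign under parity
```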

The point is, most "arrays of numbers" don't transform in any special way. It is those sets of numbers which have a particular relationship under transformations that achieve special titles like "vector", "scalar" and "tensor".


*Unknown meaning unknown to me. If anyone can cite this, I would appreciate it.
**Usually in mathematics, the requirement that tensors properly transform under parity (and hence the distinction between tensors and pseudotensors) is ignored. This just implies a slightly different definition, which offers fewer "nice" counterexamples. Actually, while I'm on this footnote, I might as well note that since I didn't explicitly define my vector space, "rotation" should be replaced by "coordinate transformation", because we can't do a completely general treatment in terms of orthogonal transformations. However, only doing orthogonal transformations allows me to tiptoe around the subject of covariant vs. contravariant tensor components, which is good news for me. So, for the sake of simplicity and/or clarity, let's just talk about O(N) rotations in nice, flat, Euclidean N-dimensional space, and tack it on to the list of things I've swept under the rug. In any case, everything I say here generalizes nicely to curved manifolds and nonorthogonal transformations. For a general treatment, see the above writeup by grey knight, if you dare...
There are "better" definitions for angular momentum, but this one is fine.