# In **R**^{n}:

If you have a vector **x** in a Euclidean vector space, a convenient way to refer to it is as x_{i}, where evaluation for different values of i gives the vector's components in some orthonormal basis. When the index i appears only once, it is assumed free, so an equation like x_{i}=y_{i}+z_{i} is true for all components, and is therefore equivalent to the vector equation **x**=**y**+**z**. This then generalises to matrices, where there are two free indices, so that the matrix **A** is written in the form A_{ij}.
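As a concrete sketch of the free-index rule (using NumPy, which is my choice of illustration, not something the text assumes), a component equation with one free index is the same thing as a vector equation:

```python
import numpy as np

y = np.array([1.0, 2.0, 3.0])
z = np.array([4.0, 5.0, 6.0])

# Component form: x_i = y_i + z_i, one equation per value of the free index i ...
x = np.array([y[i] + z[i] for i in range(len(y))])

# ... which is exactly the vector equation x = y + z.
assert np.array_equal(x, y + z)

# A matrix A_ij carries two free indices (0-based here):
A = np.arange(9.0).reshape(3, 3)
print(A[1, 2])  # the single component A_12
```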

Suppose you want to express a dot product in this notation, i.e. **a**^{T}**b**=a_{1}b_{1}+a_{2}b_{2}+...+a_{n}b_{n} in **R**^{n}. In this notation, this becomes simply a_{i}b_{i}, *where the fact that the index i appears twice in a product means that it is summed over*. This is a consistent notation, as long as you now remember that A_{ii} means the trace of the matrix **A**, *not* its (i,i)^{th} component. For example, the matrix product **C**=**AB** looks like C_{ij}=A_{ik}B_{kj} in the new notation.
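The summation convention maps directly onto NumPy's `einsum`, whose subscript strings are essentially this notation; a minimal sketch (the use of NumPy here is my choice, not the text's):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
A = np.array([[1.0, 2.0], [3.0, 4.0]])

# a_i b_i: the repeated index i is summed over -> the dot product.
dot = np.einsum('i,i->', a, b)
assert dot == np.dot(a, b)   # 1*4 + 2*5 + 3*6 = 32

# A_ii: a repeated index on one matrix -> the trace, not a diagonal entry.
tr = np.einsum('ii->', A)
assert tr == np.trace(A)     # 1 + 4 = 5
```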

# In pseudo-Riemannian manifolds:

In Special Relativity you then encounter things like x^{a}, which looks a bit bizarre at first, given how long you've been accustomed to seeing upstairs indices used for exponentiation. Be assured it is not an exponent; the upstairs index means that the vector belongs to the dual space of vectors (read about dual spaces, then come back to reading this, if necessary). Dual spaces existed in the Euclidean case too, although they weren't of any real interest, being essentially the same space - this is because the metric which identifies the space with its dual was just the identity matrix.

In relativity, things are slightly less simple - when you regard space and time together as a continuum, there must necessarily be a non-trivial metric identifying the space with its dual (otherwise space and time would be indistinguishable). This is why you start seeing both x_{a} and x^{a}.

The next big difference is what happens when you sum over indices: when you have a dot product, you are measuring a length, and so you will be making use of the metric. Length is defined by:

ds^{2}=g_{αβ}dx^{α}dx^{β}

This notation is slightly inconsistent here: the superscript 2 on the left-hand side denotes squaring, not a vector component. Unfortunately, this is common notation, and most of the time you can tell which is which.

But we could write it as:

ds^{2}=dx_{α}dx^{α}

where we see now that the dual space is acting on the, er, non-dual one, giving a scalar. Because the two expressions are exactly the same thing, dual vectors are identified with vectors by g_{αβ}a^{β}=a_{α}. Note that in all these constructions, upstairs indices are only summed together with downstairs ones; this is true in general, since contraction on an index necessarily identifies a space with its dual. Also note that in all terms of all expressions, upstairs and downstairs indices match up.
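As a sketch of the metric at work, here is the same contraction done with NumPy's `einsum`, using the Minkowski metric with signature (-, +, +, +) - a conventional choice of mine, not one the text fixes:

```python
import numpy as np

# Minkowski metric, signature (-, +, +, +); an assumed convention for illustration.
g = np.diag([-1.0, 1.0, 1.0, 1.0])

dx = np.array([2.0, 1.0, 0.0, 0.0])   # a displacement dx^alpha

# ds^2 = g_{ab} dx^a dx^b: both metric indices are contracted with upstairs ones.
ds2 = np.einsum('ab,a,b->', g, dx, dx)

# Lowering an index: dx_a = g_{ab} dx^b ...
dx_lower = np.einsum('ab,b->a', g, dx)

# ... after which ds^2 = dx_a dx^a, matching the second form above.
assert np.isclose(ds2, np.einsum('a,a->', dx_lower, dx))
print(ds2)   # -1*4 + 1*1 = -3 with this signature
```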

All this then generalises to higher tensors, allowing things like:

R_{αβ} - ½g_{αβ}R^{γ}_{γ}=8πT_{αβ}

which is a hard equation to solve.