A set of equations that (typically) share the same set of variables. A single equation in a single variable is usually all that's necessary to find a unique value for that variable. If an equation has multiple variables, though, its solutions won't be unique. However, if you have multiple equations that use those same variables, you may be able to find a unique set (or a finite number of sets) of values for those variables.

Let's consider the simplest example: linear equations. Each equation can be graphed as a line (or a plane, or a higher-dimensional point set). Oops, we're drifting from simple. Let's say we have the two-variable linear equation *x - y* = 0. If we draw this on a graph, we get a diagonal line from the lower left corner to the upper right corner of the graph. Any point on the line is a solution for *x* and *y*; for example, (-1,-1), (1,1), (2,2), etc. This equation has an infinite number of solutions.

Now we get more information in the form of a second linear equation: 2*x* + *y* = 4. If we draw this on the same graph, we get a line going from the lower right to the upper left (but with a steeper slope than the previous equation). Again, this equation has an infinite number of solutions for *x* and *y*: (0,4), (1,2), (2,0), and so on.

Separately, each equation has an infinite number of solutions, but together they have a unique solution. Since the lines aren't parallel, they intersect, and the point of intersection is the one pair of *x* and *y* values that lies on both lines.
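For our two lines, that intersection point works out to *x* = *y* = 4/3 (adding the two equations makes the *y* terms cancel, leaving 3*x* = 4). A quick check in Python, just plugging the point back into both equations:

```python
# The running example: x - y = 0 and 2x + y = 4.
# Adding the two equations eliminates y: 3x = 4, so x = 4/3,
# and the first equation then forces y = x = 4/3.
x = 4 / 3
y = x

assert abs(x - y) < 1e-9          # the point lies on x - y = 0
assert abs(2 * x + y - 4) < 1e-9  # the point lies on 2x + y = 4
```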

What we just did was find a graphical solution to a pair of linear equations. One nice thing about linear equations is that they can be solved by algebraic substitution. We can rearrange one equation so that one variable is expressed in terms of the other. Then we plug that expression into the other equation, which now contains only a single variable, and solve it. We can get the same result by elimination: adding a multiple of one equation to the other so that a variable cancels out.
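The "adding equations together" trick can be sketched as a small helper for the general two-variable case (the name `solve_2x2` and the coefficient layout are just for this illustration):

```python
def solve_2x2(a1, b1, c1, a2, b2, c2):
    """Solve a1*x + b1*y = c1 and a2*x + b2*y = c2 by elimination.

    Multiplying the first equation by b2 and the second by b1, then
    subtracting, cancels the y terms and leaves one equation in x;
    the same trick with a1 and a2 isolates y.
    """
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("no unique solution (parallel or identical lines)")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Our running example, x - y = 0 and 2x + y = 4:
print(solve_2x2(1, -1, 0, 2, 1, 4))  # -> (1.3333333333333333, 1.3333333333333333)
```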

When dealing with a large number of simultaneous (linear) equations, it's often convenient to put them in matrix form. The basic approach is to put the coefficients of the equations into a matrix, which, when multiplied by a column vector of the variables, produces a column vector of constants. We can take the inverse of the matrix (got complicated rather fast, didn't it?), multiply both sides of the equation by it, and get the values of the variables immediately.
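Here's that matrix approach in miniature, as a sketch in plain Python rather than a matrix library. For a 2×2 matrix the inverse has a simple closed form, so we can write out every step:

```python
# Matrix form of the running example:
#   [ 1 -1 ] [x]   [0]
#   [ 2  1 ] [y] = [4]
# For a 2x2 matrix [[a, b], [c, d]], the inverse is
# (1/det) * [[d, -b], [-c, a]], where det = a*d - b*c.
a, b = 1, -1
c, d = 2, 1
rhs = [0, 4]  # the column vector of constants

det = a * d - b * c  # here 3, so the matrix is invertible
inv = [[ d / det, -b / det],
       [-c / det,  a / det]]

# Multiplying the inverse by the constants vector yields the variables.
x = inv[0][0] * rhs[0] + inv[0][1] * rhs[1]
y = inv[1][0] * rhs[0] + inv[1][1] * rhs[1]
print(x, y)  # both come out to 4/3, matching the graphical solution
```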

Are we always guaranteed a unique solution? No. Imagine if in the earlier example the lines were parallel. In this case, there would be no solution. Or, imagine if the two equations described the same line. In that case, there would be an infinite number of solutions. When we use matrices, there are quick tests for determining the number of solutions. Appropriately enough, these involve quantities called determinants.
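A sketch of that determinant test for the two-variable case (`count_solutions` is a name invented for this example): the determinant is zero exactly when the two lines are parallel or identical, so it sorts the three cases above.

```python
def count_solutions(a1, b1, c1, a2, b2, c2):
    """Classify the system a1*x + b1*y = c1, a2*x + b2*y = c2."""
    det = a1 * b2 - a2 * b1
    if det != 0:
        return "one"       # the lines cross at a single point
    # det == 0: the lines are parallel; they are the same line if one
    # equation is a multiple of the other (checked with cross-products
    # to avoid dividing by possibly zero coefficients).
    if a1 * c2 == a2 * c1 and b1 * c2 == b2 * c1:
        return "infinite"  # same line twice
    return "none"          # two distinct parallel lines

print(count_solutions(1, -1, 0, 2, 1, 4))   # -> one (our example)
print(count_solutions(1, -1, 0, 1, -1, 2))  # -> none (parallel lines)
print(count_solutions(1, -1, 0, 2, -2, 0))  # -> infinite (same line)
```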

Incidentally, these techniques cover only *linear* equations. When the equations are nonlinear, things turn ugly.