The Cauchy-Riemann equations can be derived purely analytically via a
simple calculation as above, but they are fundamentally geometric,
albeit not in a
way that's immediately obvious. The complex numbers can be defined
algebraically; the imaginary unit is a root of the polynomial
*x*^{2} + 1, but the real appreciation of the complex
numbers is essentially geometric. In fact, that's exactly how the
complex numbers first lost their mysterious aura that made
down-to-earth mathematicians so uneasy and earned *i* the
"imaginary" moniker. When Carl Friedrich Gauss introduced the complex
*plane* and showed that the complex numbers can be understood
in terms of the geometry of this plane, complex numbers entered
mainstream mathematics and nobody had any reservations about what they
really meant anymore. In an attempt to relive great
moments in mathematical history, I shall now try to do something
similar and explain the Cauchy-Riemann equations as geometrically
as I can. In order to do that, I first need to explain
a little more about the geometry of the underlying complex
numbers. Linear algebra is a wonderful modern language for talking
about geometry, and it is the language I shall adopt below.

Let us identify, as Gauss did, the complex plane **C** with the
real plane **R**^{2}. A complex number *z* is uniquely
determined by an ordered pair (*x*, *y*), i.e. *z* =
*x* + *yi*, and vice versa. From this point of view,
complex numbers are simply two-dimensional vectors of real
numbers, and addition and subtraction of complex numbers can be
accomplished simply by adding and subtracting corresponding entries of
these vectors. You can also scale complex numbers by real numbers, and
thus we have a *bona fide* vector space. So far, so good.
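
As a quick illustration (a minimal sketch of my own, not part of the
argument; the helper name `to_pair` is hypothetical), Python's built-in
complex type makes this identification easy to check:

```python
# Identify z = x + yi with the vector (x, y); Python's built-in complex
# type lets us verify that addition and real scaling act componentwise.

def to_pair(z):
    """Return the vector (x, y) corresponding to z = x + yi."""
    return (z.real, z.imag)

z = 3 + 4j
w = 1 - 2j

# Complex addition is addition of the corresponding vectors.
assert to_pair(z + w) == (to_pair(z)[0] + to_pair(w)[0],
                          to_pair(z)[1] + to_pair(w)[1])

# Scaling by a real number r scales each component.
r = 2.5
assert to_pair(r * z) == (r * to_pair(z)[0], r * to_pair(z)[1])
```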

The real geometric magic of the complex numbers comes into play when
we consider complex multiplication. Let *z* and *w* be
arbitrary complex numbers, seen as elements of a vector space,
let *r* be some arbitrary real number, and let *c* be a fixed
complex number. We shall view *c* as a linear operator. Really,
all of these numbers are complex numbers, they are exactly the same
sort of creatures, but I am giving them different roles right now in
order to discover an underlying geometric truth. Because the complex
numbers are a field, in particular they satisfy the associative and distributive
properties that any field has to satisfy. This means that

*c*(*z* + *w*) = *c*(*z*) + *c*(*w*)

*c*(*rz*) = *rc*(*z*),

that is, *c* is indeed a linear transformation; it acts linearly
on the vector space of complex numbers. This trivial observation has a
nice consequence once we express *c* as a matrix
in the standard basis {(1, 0), (0, 1)} of **R**^{2},
which corresponds to the real and imaginary units of **C**. Recall
that the matrix of a linear transformation in a specific basis is
found by seeing what the linear transformation does to such a
basis. If *c* = *a* + *bi* and we keep in mind
both views of complex numbers as a field and complex numbers as a
vector space, then

/[ 1 ]\ [ *a* ]
*c*(1) = *c*( [ ] ) = (*a* + *b**i*)(1) = *a* + *b**i* = [ ]
\[ 0 ]/ [ *b* ]
/[ 0 ]\ [-*b* ]
*c*(*i*) = *c*( [ ] ) = (*a* + *b**i*)(*i*) = -*b* + *a**i* = [ ],
\[ 1 ]/ [ *a* ]

so that in the standard basis of **R**^{2}, when *c* =
*a* + *bi*
is viewed as a linear transformation, its matrix in this basis is

[ *a* -*b* ]
*c* = [ ].
[ *b* *a* ]

We can take this one step further. Matrices can also be added
and subtracted, not just multiplied, and because the determinant of
this matrix is *a*^{2} + *b*^{2} which is
zero only if both *a* and *b* are zero, all these matrices
except the zero matrix are also invertible, which is to say that we
can divide. This means that we can view complex numbers as matrices!

**Theorem** The complex numbers are *isomorphic*
to the field of matrices of the form
[ *a* -*b* ]
[ ].
[ *b* *a* ]

*Proof*: This is a routine calculation. We would have to prove that
matrix addition and subtraction correspond to complex addition and
subtraction, but this is obvious, because these matrices are
determined by just two entries, which are added and subtracted
componentwise exactly as complex numbers are. It is
a little less obvious that matrix multiplication corresponds to
complex multiplication, although the view of these matrices as linear
maps helps. We can either multiply two arbitrary matrices of this form
and see that the result does indeed coincide with the result of
multiplying the two corresponding complex numbers, or we can view
complex multiplication as a linear transformation, and recall that
matrix multiplication corresponds to composition of linear
transformations which in turn corresponds to complex
multiplication. This second viewpoint also allows us to see that
matrix inversion corresponds to complex division, or alternatively we
could carry out the computations of matrix inversion and see that they
correspond with the computations for complex division. These are all
simple verifications, and I don't have any qualms in trusting my
readers to carry them out for themselves if they really want to
believe this isomorphism. ♦
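
For readers who want a concrete spot-check before doing the algebra,
here is a minimal numeric sketch in Python (the helper names `mat` and
`matmul` are my own, chosen for this illustration):

```python
# Represent c = a + bi by the matrix [[a, -b], [b, a]] and check that
# matrix multiplication agrees with complex multiplication.

def mat(c):
    """Matrix of the linear map z -> c*z in the standard basis."""
    a, b = c.real, c.imag
    return [[a, -b], [b, a]]

def matmul(m, n):
    """Multiply two 2x2 matrices."""
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

z = 2 + 3j
w = -1 + 4j

# The matrix of a product is the product of the matrices.
assert matmul(mat(z), mat(w)) == mat(z * w)
assert matmul(mat(w), mat(z)) == mat(w * z)
```

A few such cases are of course not a proof, but they are a quick way to
convince oneself that the isomorphism is worth verifying in full.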

We could say more here. We could analyse more about the geometry of
the complex numbers by analysing the geometry of these matrices. For
example, the norm of a complex number, seen as a vector, is the square
root of the determinant of its matrix, and if you notice how similar
these matrices are in shape to rotation matrices

[ cos θ -sin θ ]
[ ],
[ sin θ cos θ ]

then we could also talk about the relationship between complex
multiplication and rotations of the complex plane. I shall not pursue
these ideas, however, because they are not essential in order to
understand the geometry of the Cauchy-Riemann equations. Complex
numbers are matrices, and this will do for now.
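
As a brief numeric aside (a sketch of my own under the identification
above, not a development of these ideas), both observations can be
spot-checked directly:

```python
import math

# For c = a + bi, the determinant of [[a, -b], [b, a]] is a**2 + b**2,
# whose square root is the norm |c|.
c = 3 + 4j
a, b = c.real, c.imag
det = a * a + b * b
assert math.isclose(math.sqrt(det), abs(c))  # sqrt(det) = |c|

# Writing c = |c|(cos t + i sin t) exhibits the rotation-matrix shape.
theta = math.atan2(b, a)
assert math.isclose(abs(c) * math.cos(theta), a)
assert math.isclose(abs(c) * math.sin(theta), b)
```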

## Complex differentiation as a Jacobian

Let us now talk about functions. Since the complex numbers are a
normed field (the norm of a complex number is simply the distance to
the origin when viewing complex numbers as vectors), it makes sense to
define complex differentiation as follows.

**Definition** Let *f* : **C** → **C** be a
complex-valued function. We say that *f* is
**differentiable** at *c* if the limit
*f*(*c*+*h*) - *f*(*c*)
lim --------------
*h*→0 *h*

exists. In such a situation, we let *f*'(*c*) denote this
limit and call it the **derivative** of *f* at
*c*.
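
A small numeric sketch (my own illustration; the function and sample
point are arbitrary choices): for *f*(*z*) = *z*², the difference
quotient approaches 2*c* no matter from which direction *h* tends to 0
in the plane.

```python
# Difference quotient (f(c + h) - f(c)) / h for f(z) = z**2; it should
# approach f'(c) = 2c as h -> 0 from any direction in the complex plane.

def diff_quotient(f, c, h):
    return (f(c + h) - f(c)) / h

f = lambda z: z * z
c = 1 + 2j

# Approach along the real axis, the imaginary axis, and a diagonal.
for h in (1e-6, 1e-6j, 1e-6 * (1 + 1j)):
    assert abs(diff_quotient(f, c, h) - 2 * c) < 1e-5
```

The direction-independence of this limit is precisely what makes the
complex definition so much stronger than its real counterpart.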

The definition is exactly the same as the one for real
functions. That's what makes it so magical. It seems to ask of a
complex function no more than what the real definition asks of a
real-valued function, but it turns out that the geometry of the complex
numbers yields a great bounty out of this definition that the real
numbers cannot give. Part of this bounty is the Cauchy-Riemann
equations.

Let us go back to the identification of the complex plane **C**
with the real two-dimensional vector space **R**^{2}. In
such a situation, we are talking about functions *f* :
**R**^{2} → **R**^{2}, and it also makes
sense to talk about differentiation for those functions. For a
function of two variables that takes two-dimensional values, the
notion corresponding to the derivative is the *Jacobian*
matrix. If we write a complex-valued function *f*(*z*)
as a function of two variables with two component functions *u*
and *v*, that is, *f*(*x*,*y*) =
(*u*(*x*,*y* ), *v*(*x*,*y*)), the
derivative of *f* is the Jacobian matrix

[*u*_{x} *u*_{y}]
[ ],
[*v*_{x} *v*_{y}]

where I have used subindices in order to represent partial differentiation with
respect to a particular variable.

This is the crux of the matter. It turns out that if we view the
complex plane as **R**^{2}, then this Jacobian derivative
has to be the same as the complex derivative *f*'(*z*). They
are both linear transformations (see above why complex numbers can be
seen as linear transformations), and they are both the best linear
approximation to *f*, and there is only one derivative, only one
best linear approximation. It means that these two objects must be the
*same*. The Jacobian matrix has to be a complex number:

[*u*_{x} *u*_{y}] [ a -b ]
*f*'(*z*)= [ ] = [ ].
[*v*_{x} *v*_{y}] [ b a ]

In other symbols, using now a different notation to denote partial
differentiation,

∂*u* ∂*v* ∂*u* ∂*v*
-- = -- -- = - -- ,
∂*x* ∂*y* ∂*y* ∂*x*

which are exactly the Cauchy-Riemann equations.
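
As a concrete sanity check (a sketch of my own; the function
*f*(*z*) = *z*² and the sample point are arbitrary), the equations can
be verified with finite differences:

```python
# For f(z) = z**2 = (x + yi)**2, the components are u(x, y) = x**2 - y**2
# and v(x, y) = 2*x*y; central differences confirm u_x = v_y, u_y = -v_x.

def u(x, y):
    return x * x - y * y

def v(x, y):
    return 2 * x * y

def partial(g, x, y, wrt, h=1e-6):
    """Central-difference approximation of a partial derivative."""
    if wrt == "x":
        return (g(x + h, y) - g(x - h, y)) / (2 * h)
    return (g(x, y + h) - g(x, y - h)) / (2 * h)

x0, y0 = 1.0, 2.0
assert abs(partial(u, x0, y0, "x") - partial(v, x0, y0, "y")) < 1e-6
assert abs(partial(u, x0, y0, "y") + partial(v, x0, y0, "x")) < 1e-6
```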

It's remarkable how it all fits together. Complex multiplication
gives us a way to view complex numbers on the real plane as
linear transformations. Linear transformations have a matrix, and the
structure of complex multiplication endows this matrix with a special
structure of its own. A complex derivative is also a linear
transformation, and therefore it must have the same matrix
structure. But saying that the complex derivative has the same matrix
structure as any other complex number is exactly what the
Cauchy-Riemann equations are saying.