Pair of differential equations satisfied by any holomorphic (complex analytic) function f: C -> C.

We can write f(z) = u(z) + iv(z) for real functions u,v, and then write z=x+iy. When

     f(z+h)-f(z)
lim  -----------
h->0      h
exists, the Cauchy-Riemann equations state that the partial derivatives satisfy:
∂u   ∂v        ∂u     ∂v
-- = --  ,     -- = - --
∂x   ∂y        ∂y     ∂x
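The equations are easy to check numerically. The sketch below (my own illustrative choices of functions and test point, not from the text) estimates the four partial derivatives with central differences and verifies that an entire function such as exp satisfies both equations, while the conjugation map, which is nowhere analytic, violates them:

```python
# A quick numerical sanity check of the Cauchy-Riemann equations.
import cmath

def partials(f, x, y, h=1e-6):
    """Central-difference estimates of u_x, u_y, v_x, v_y at z = x + iy."""
    fx = (f(complex(x + h, y)) - f(complex(x - h, y))) / (2 * h)
    fy = (f(complex(x, y + h)) - f(complex(x, y - h))) / (2 * h)
    return fx.real, fy.real, fx.imag, fy.imag  # u_x, u_y, v_x, v_y

# exp(z) is entire, so the equations should hold everywhere:
ux, uy, vx, vy = partials(cmath.exp, 0.3, -0.7)
assert abs(ux - vy) < 1e-6 and abs(uy + vx) < 1e-6

# conj(z) is nowhere analytic: here u = x, v = -y, so u_x = 1 but v_y = -1.
ux, uy, vx, vy = partials(lambda z: z.conjugate(), 0.3, -0.7)
assert abs(ux - vy) > 1  # 1 - (-1) = 2, far from zero
```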

A couple of additions. Complex functions are often expressed using polar coordinates. Thus we might write
f(z) = u(r,t) + iv(r,t)
where t is the polar angle θ. The Cauchy-Riemann equations now read:
(du/dr) = (1/r) * (dv/dt)
(dv/dr) = (-1/r) * (du/dt)
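The polar form can be checked numerically too. For f(z) = z² (my choice of test function) we have u(r,t) = r² cos 2t and v(r,t) = r² sin 2t, and the sketch below verifies both polar equations by central differences:

```python
# Numerically checking the polar-form Cauchy-Riemann equations for f(z) = z^2,
# whose real and imaginary parts in polar coordinates are r^2 cos 2t, r^2 sin 2t.
import math

def u(r, t):
    return (r * r) * math.cos(2 * t)

def v(r, t):
    return (r * r) * math.sin(2 * t)

def d(g, r, t, var, h=1e-6):
    """Central difference of g with respect to 'r' or 't'."""
    if var == 'r':
        return (g(r + h, t) - g(r - h, t)) / (2 * h)
    return (g(r, t + h) - g(r, t - h)) / (2 * h)

r, t = 1.5, 0.8
assert abs(d(u, r, t, 'r') - d(v, r, t, 't') / r) < 1e-6   # u_r = (1/r) v_t
assert abs(d(v, r, t, 'r') + d(u, r, t, 't') / r) < 1e-6   # v_r = -(1/r) u_t
```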

Also, the Cauchy-Riemann equations are only necessary conditions for a function to be analytic. Continuity of the first partial derivatives of u and v is the additional condition required for sufficiency.

The Cauchy-Riemann equations are a pair of relationships between the partial derivatives of a complex-valued function of a complex variable. If the function is differentiable at a point, the equations hold at that point. If they hold in a disk D around some point, and the partial derivatives are continuous within that disk, then the function is analytic at that point. If the Cauchy-Riemann equations hold everywhere (i.e. the function is analytic on all of C), the function is said to be an entire function. This is true of most of the basic elementary functions.

Rectangular Coordinates

Proof of the Cauchy-Riemann equations in rectangular coordinates.

Let f(z) be a complex-valued function of a complex variable

z = x + iy

Let

f(z) = u(z) + iv(z)

Define the derivative of f(z) to be

         lim   f(z + Δz) - f(z)
f'(z) =        ----------------
        Δz->0         Δz

Then we have

         lim   u(z + Δz) - u(z)     v(z + Δz) - v(z)
f'(z) =        ---------------- + i ----------------
        Δz->0         Δz                   Δz

Now we let

Δz = Δx + 0 i

This gives

         lim   u(x + Δx, y) - u(x, y)     v(x + Δx, y) - v(x, y)
f'(z) =        ---------------------- + i ----------------------
        Δx->0            Δx                         Δx

Recalling the definition of a partial derivative from vector calculus shows that

         ∂u     ∂v
f'(z) =  -- + i --
         ∂x     ∂x

Now we return to our previous equation in u, v, and z and let

Δz = 0 + i Δy

This gives

         lim   u(x, y + Δy) - u(x, y)     v(x, y + Δy) - v(x, y)
f'(z) =        ---------------------- + i ----------------------
        Δy->0           i Δy                       i Δy

Again recalling the definition of partial derivative, we see that

            ∂u   ∂v
f'(z) = - i -- + --
            ∂y   ∂y

         OR

         ∂v     ∂u
f'(z) =  -- - i --
         ∂y     ∂y

Observe that these are both equations for f'(z)! Thus we set the real and imaginary parts equal to one another and obtain the famous Cauchy-Riemann equations in rectangular form.

 ∂u   ∂v       ∂u     ∂v
 -- = --  and  -- = - --
 ∂x   ∂y       ∂y     ∂x

Q.E.D.
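The proof above says that f'(z) can be computed two ways: as u_x + i·v_x (stepping in the x direction) or as v_y - i·u_y (stepping in the y direction). A small sketch checking both against the known derivative of sin, namely cos (the test point is my own arbitrary choice):

```python
# f'(z) computed by stepping along the real axis and along the imaginary axis;
# for an analytic function both limits agree, and both equal cos(z) for f = sin.
import cmath

z = complex(0.4, 1.1)
h = 1e-6
fx = (cmath.sin(z + h) - cmath.sin(z - h)) / (2 * h)             # u_x + i v_x
fy = (cmath.sin(z + 1j * h) - cmath.sin(z - 1j * h)) / (2j * h)  # v_y - i u_y

assert abs(fx - cmath.cos(z)) < 1e-5
assert abs(fy - cmath.cos(z)) < 1e-5
assert abs(fx - fy) < 1e-5
```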

Polar Coordinates

Proof of the Cauchy-Riemann Equations in polar coordinates.

If we let

z = r e^(iθ)

Then we have the following important relationships which are familiar from analytic geometry

x = r cos θ
y = r sin θ

We proceed to find polar equivalents of our partial derivatives. From the relations above,

∂r/∂x = cos θ,   ∂θ/∂x = -(sin θ)/r,   ∂r/∂y = sin θ,   ∂θ/∂y = (cos θ)/r

and the full chain rule gives

∂u   ∂u ∂r   ∂u ∂θ   ∂u          ∂u   sin θ
-- = -- -- + -- -- = -- cos θ -  -- * -----
∂x   ∂r ∂x   ∂θ ∂x   ∂r          ∂θ     r

∂v   ∂v ∂r   ∂v ∂θ   ∂v          ∂v   cos θ
-- = -- -- + -- -- = -- sin θ +  -- * -----
∂y   ∂r ∂y   ∂θ ∂y   ∂r          ∂θ     r

Since we know these expressions are equal from the rectangular forms,

∂u          ∂u   sin θ   ∂v          ∂v   cos θ
-- cos θ -  -- * ----- = -- sin θ +  -- * -----        (1)
∂r          ∂θ     r     ∂r          ∂θ     r

Continuing with the next set gives

∂u   ∂u          ∂u   cos θ       ∂v   ∂v          ∂v   sin θ
-- = -- sin θ +  -- * -----  and  -- = -- cos θ -   -- * -----
∂y   ∂r          ∂θ     r         ∂x   ∂r          ∂θ     r

Again, the rectangular forms tell us that ∂u/∂y = - ∂v/∂x, so

∂u          ∂u   cos θ     ∂v          ∂v   sin θ
-- sin θ +  -- * ----- = - -- cos θ +  -- * -----      (2)
∂r          ∂θ     r       ∂r          ∂θ     r

Multiplying (1) by cos θ and (2) by sin θ and adding cancels the θ-derivative terms on the left and the r-derivative terms on the right, leaving

∂u   1 ∂v             ∂v       ∂u
-- = - --    i.e.     -- = r --
∂r   r ∂θ             ∂θ       ∂r

Multiplying (1) by -sin θ and (2) by cos θ and adding instead gives

1 ∂u     ∂v           ∂u         ∂v
- -- = - --  i.e.     -- = - r --
r ∂θ     ∂r           ∂θ         ∂r

Thus we have the Cauchy-Riemann Equations for polar coordinates as well!

∂u       ∂v     ∂v     ∂u
-- = - r -- and -- = r --
∂θ       ∂r     ∂θ     ∂r

Q.E.D.

References:

MathWorld.Wolfram.com

George Cain. Complex Analysis. http://www.math.gatech.edu/~cain/winter99/complex.html

The Cauchy-Riemann equations may look complicated, but like most mathematical equations they are really just a restatement of the obvious using fancy symbols.

Let's play a game. I've drawn an arrow on the ground in the direction we'll both call "forward". I'm going to watch where you walk, and wherever you go I'll go twice as far in the same direction. Let's call x and y, respectively, the amount you walked in the forward and leftward directions, and u and v will be the amount that I walked in these same directions. So, for example, if you took one step forward and one step to the left, then x=1 and y=1, and since I said I would match your direction and twice your distance I would take two steps forward and two to the left, so u=2 and v=2.

So having set the game up in this way, it isn't particularly deep to say that the number of steps I will walk forward, u, when you walk x steps forward is the same as the number I'll walk left, v, when you walk left the same distance, y. The first Cauchy-Riemann equation is just a fancy restatement of this obvious fact:

du   dv
-- = --
dx   dy


That is, divide the number of steps I took forward by the number you took forward, and you'll find it equals the number of steps I took left divided by the number you took left. In both cases the amount will be 2, since that was the rule of the game that I chose.

Now let's play a slight variation of this game. For every step you take, I will take one step in the direction left of where you walked. So if you take one step forward, I'll take one step left. And if you take one step left, I'll go one step backwards, since that's the direction that is left of left. Put another way, the amount of steps that I'll go left, v, when you go forward x steps is the opposite of the number of steps I'll go forward, u, when you go the same amount left, y, since in the latter case I'll be walking backwards. This gets us the second Cauchy-Riemann equation,

dv     du
-- = - --
dx     dy


These equations themselves are not very deep. What's deep is the fact that I was able to pick a rule for the game such that the direction I walked in was a fixed rotation of the direction you walked (that is, no rotation in the first game, 90 degrees of rotation in the second) times a constant value. So basically what I walked was just a multiple of what you walked --- if you're willing to expand your idea of multiplication to include rotations, which is exactly what complex numbers do! So in the first case I walked where you walked times 2, and in the second case I walked where you walked times i, where i means left. These were two of the simplest possible cases; I could just as easily have said that for every step you take forward, I'll take a step 45 degrees left of where you walked --- or in complex number form, I will walk (1+i)/sqrt(2) times where you walked. (That sqrt(2) has to be there due to the Pythagorean theorem -- walking one step forward and one step left means walking a total distance of sqrt(1+1)=sqrt(2), so I had to divide that factor out if I wanted to keep my distance the same as yours.)

I don't have to play a game that takes this special form. I could just as easily pick a rule that says, "for every step forward you take I'll walk two steps backwards, and for every step left you take I'll walk three steps left", and you will have a hard time reducing this to multiplication by a complex number. So the Cauchy-Riemann equations are not significant in and of themselves as much as they say something significant about the game --- that it can be reduced to multiplication by a complex number.

Of course, the equations aren't about games, they are about functions. But the same idea still basically holds; let z be the spot where you are standing, f be where I am standing, dz the (small) distance that you decide to walk, and df the (small) amount that f changes in response. Thus, df/dz is the rate at which I move with respect to how much you moved. This quantity may be different at different locations (that is, for different z), and this is why I said that df and dz had to be small. If df/dz can be given by a single complex number -- just like our games -- then f necessarily satisfies the Cauchy-Riemann equations, since these equations are nothing more than a re-statement of what it means for df/dz to be a single complex number, just as before the equations were nothing more than a re-statement of the rules of our games.

So like most things in mathematics, these equations really just tell us what we already knew, just in a different form.

The Cauchy-Riemann equations can be derived purely analytically via a simple calculation as above, but they are fundamentally geometric, albeit not in a way that's immediately obvious. The complex numbers can be defined algebraically; the imaginary unit is a root of the polynomial x² + 1, but the real appreciation of the complex numbers is essentially geometric. In fact, that's exactly how the complex numbers first lost the mysterious aura that made down-to-earth mathematicians so uneasy and earned i the "imaginary" moniker. When Carl Friedrich Gauss introduced the complex plane and showed that the complex numbers can be understood in terms of the geometry of this plane, complex numbers entered mainstream mathematics and nobody had any reservations about what they really meant anymore. In an attempt to relive great moments in mathematical history, I shall now attempt to do something similar and explain the Cauchy-Riemann equations as geometrically as I can, and in order to do that, I'm going to first need to explain a little more about the geometry of the underlying complex numbers. Linear algebra is a wonderful modern language for talking about geometry, and it is the language I shall adopt below.

Complex multiplication as a linear operation

Let us identify, as Gauss did, the complex plane C with the real plane R². A complex number z is uniquely determined by an ordered pair (x, y), i.e. z = x + yi, and vice versa. Under this point of view, complex numbers are simply two-dimensional vectors of real numbers, and addition and subtraction of complex numbers can be accomplished simply by adding and subtracting corresponding entries of these vectors. You can also scale complex numbers by real numbers, and thus we have a bona fide vector space. So far, so good.

The real geometric magic of the complex numbers comes into play when we consider complex multiplication. Let z and w be arbitrary complex numbers, seen as elements of a vector space, let r be an arbitrary real number, and let c be a fixed complex number. We shall view c as a linear operator. Really, all of these numbers are complex numbers, they are exactly the same sort of creatures, but I am giving them different roles right now in order to discover an underlying geometric truth. Because the complex numbers are a field, they in particular satisfy the associative and distributive properties that any field has to satisfy. This means that

c(z + w) = c(z) + c(w)
c(rz) = rc(z),

that is, c is indeed a linear transformation; it acts linearly on the vector space of complex numbers. This trivial observation has a nice consequence once we express c as a matrix in the standard basis {(1,0), (0,1)} of R², which corresponds to the real and imaginary units of C. Recall that the matrix of a linear transformation in a specific basis is found by seeing what the linear transformation does to that basis. If c = a + bi and we keep in mind both views of complex numbers, as a field and as a vector space, then

              /[ 1 ]\                           [ a ]
     c(1) = c( [   ] ) = (a + bi)(1) = a + bi = [   ]
              \[ 0 ]/                           [ b ]
     
              /[ 0 ]\                            [-b ]
     c(i) = c( [   ] ) = (a + bi)(i) = -b + ai = [   ],
              \[ 1 ]/                            [ a ]

so that in the standard basis of R², when c = a + bi is viewed as a linear transformation, its matrix in this basis is

          [ a  -b ]
     c =  [       ].
          [ b   a ]

We can take this one step further. Matrices can also be added and subtracted, not just multiplied, and because the determinant of this matrix is a² + b², which is zero only if both a and b are zero, every such matrix except the zero matrix is also invertible, which is to say that we can divide. This means that we can view complex numbers as matrices!

Theorem The complex numbers are isomorphic to the field of matrices of the form
          [ a  -b ]
          [       ].
          [ b   a ]

Proof: This is a routine calculation. We would have to prove that matrix addition and subtraction corresponds to complex addition and subtraction, but this is obvious, because these matrices are determined only by two entries which are added and subtracted componentwise just as complex numbers are added and subtracted. It is a little less obvious that matrix multiplication corresponds to complex multiplication, although the view of these matrices as linear maps helps. We can either multiply two arbitrary matrices of this form and see that the result does indeed coincide with the result of multiplying the two corresponding complex numbers, or we can view complex multiplication as a linear transformation, and recall that matrix multiplication corresponds to composition of linear transformations which in turn corresponds to complex multiplication. This second viewpoint also allows us to see that matrix inversion corresponds to complex division, or alternatively we could carry out the computations of matrix inversion and see that they correspond with the computations for complex division. These are all simple verifications, and I don't have any qualms in trusting my readers to carry them out for themselves if they really want to believe this isomorphism. ♦
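One of the routine verifications the proof leaves to the reader can be sketched in a few lines. The code below (names are my own) maps a + bi to the matrix [[a, -b], [b, a]] and checks that matrix multiplication agrees with complex multiplication:

```python
# The isomorphism of the theorem: a + bi  <->  [[a, -b], [b, a]].
def to_matrix(c):
    """Matrix of multiplication-by-c in the standard basis of R^2."""
    a, b = c.real, c.imag
    return [[a, -b], [b, a]]

def matmul(M, N):
    """Plain 2x2 matrix product."""
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

z, w = complex(2, 3), complex(-1, 4)
# Multiplying the matrices gives the matrix of the product:
assert matmul(to_matrix(z), to_matrix(w)) == to_matrix(z * w)
```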

We could say more here. We could analyse more of the geometry of the complex numbers by analysing the geometry of these matrices. For example, the norm of a complex number, seen as a vector, is the square root of the determinant of its matrix, and if you notice how similar in shape these matrices are to the rotation matrices

          [ cos θ   -sin θ ]
          [                ],
          [ sin θ    cos θ ]

then we could also talk about the relationship between complex multiplication and rotations of the complex plane. I shall not pursue these ideas, however, because they are not essential in order to understand the geometry of the Cauchy-Riemann equations. Complex numbers are matrices, and this will do for now.

Complex differentiation as a Jacobian

Let us now talk about functions. Since the complex numbers are a normed field (the norm of a complex number is simply the distance to the origin when viewing complex numbers as vectors), it makes sense to define complex differentiation as follows.

Definition Let f : C → C be a complex-valued function. We say that f is differentiable at c if the limit
             f(c+h) - f(c)
       lim  --------------
       h→0        h
exists. In such a situation, we let f'(c) denote such a limit and call it the derivative of f at c.

The definition is exactly the same as the one for real functions. That's what makes it so magical. It seems to ask of a complex function no more than what is asked of a corresponding real-valued function, but it turns out that the geometry of the complex numbers yields a great bounty out of this definition that the real numbers cannot give. Part of this bounty is the Cauchy-Riemann equations.

Let us go back to the identification of the complex plane C with the real two-dimensional vector space R². In that situation we are talking about functions f : R² → R², and it also makes sense to talk about differentiation for those functions. For a function of two variables that takes values in two variables, the idea corresponding to the derivative is the Jacobian matrix. If we write a complex-valued function f(z) as a function of two variables with two component functions u and v, that is, f(x,y) = (u(x,y), v(x,y)), then the derivative of f is the Jacobian matrix

          [ux uy] 
          [     ],
          [vx vy]   

where I have used subscripts to denote partial differentiation with respect to a particular variable.

This is the crux of the matter. It turns out that if we view the complex plane as R², then this Jacobian derivative has to be the same as the complex derivative f'(z). They are both linear transformations (see above for why complex numbers can be seen as linear transformations), and they are both the best linear approximation to f; since there is only one derivative, only one best linear approximation, these two objects must be the same. The Jacobian matrix has to be a complex number:

          [ux uy]       [ a  -b ]
   f'(z)= [     ]   =   [       ].
          [vx vy]       [ b   a ]

In other symbols, using now a different notation to denote partial differentiation,

      ∂u   ∂v        ∂u     ∂v
      -- = --  ,     -- = - -- ,
      ∂x   ∂y        ∂y     ∂x

which are exactly the Cauchy-Riemann equations.
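This viewpoint can also be verified numerically. The sketch below (test function and point are my own choices) estimates the Jacobian of f(z) = z³ by central differences and checks both that it has the complex-number shape [[a, -b], [b, a]] and that a + bi equals the known derivative 3z²:

```python
# The Jacobian of an analytic map has the shape [[a, -b], [b, a]],
# and a + bi is precisely f'(z). Checked here for f(z) = z^3.
def f(x, y):
    w = complex(x, y) ** 3
    return w.real, w.imag   # (u(x,y), v(x,y))

x, y, h = 0.6, -0.2, 1e-6
ux = (f(x + h, y)[0] - f(x - h, y)[0]) / (2 * h)
vx = (f(x + h, y)[1] - f(x - h, y)[1]) / (2 * h)
uy = (f(x, y + h)[0] - f(x, y - h)[0]) / (2 * h)
vy = (f(x, y + h)[1] - f(x, y - h)[1]) / (2 * h)

fprime = 3 * complex(x, y) ** 2
assert abs(ux - vy) < 1e-6 and abs(uy + vx) < 1e-6   # Cauchy-Riemann shape
assert abs(complex(ux, vx) - fprime) < 1e-5          # a + bi = f'(z)
```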

It's remarkable how it all fits together. Complex multiplication gives us a way to view complex numbers on the real plane as linear transformations. Linear transformations have a matrix, and the structure of complex multiplication endows this matrix with a special structure of its own. A complex derivative is also a linear transformation, and therefore must have the same matrix structure. But saying that the complex derivative has the same matrix structure as any other complex number is exactly what the Cauchy-Riemann equations are saying.
