Suppose we have an associative multiplication operation "*" (a group always has one, but we can also find them in other places -- see e.g. the carry lookahead adder). Fix some `g` in the domain of the multiplication. Fast exponentiation lets us compute g^{n} in O(log `n`) multiplications. Since the size (bit length) of `n` is log_{2} `n`, this is a linear-time algorithm for exponentiation. The other direction -- computing `n` from g^{n} -- is the "discrete logarithm problem", and turns out to be a lot harder in many cases.
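Here's a quick sketch of fast exponentiation (square-and-multiply) in Python. The function and parameter names (`fast_pow`, `mul`, `identity`) are made up for illustration; the point is that the operation is abstract -- any associative `mul` will do:

```python
def fast_pow(g, n, mul, identity):
    """Compute g^n using O(log n) multiplications (square-and-multiply)."""
    result = identity
    base = g
    while n > 0:
        if n & 1:                    # this bit of n is set: fold in the current power
            result = mul(result, base)
        base = mul(base, base)       # square: base becomes g^(2^i)
        n >>= 1
    return result

# Example: exponentiation modulo a prime, checked against Python's built-in pow.
p = 1_000_003
assert fast_pow(5, 123_456, lambda a, b: a * b % p, 1) == pow(5, 123_456, p)
```

Note that the loop runs once per bit of `n` -- that's the "O(log `n`) multiplications" claim above, made concrete.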

This makes discrete logarithms an extremely elegant one-way function. Its security is known to be at least as good as the security of factorisation (the other leading brand of one-way function) -- if you can break discrete logarithms, you can factorise. The other direction -- whether it is no harder than factorisation -- is not known. Discrete logarithms are also appealing because the input and output have such similar structure -- both are typically bit strings of the same length. Diffie-Hellman key exchange, as well as DSA and DSS, are based on this one-way function.

We'll be working in a group G, since the problem there is hard enough. Since we're fixing g once and for all, *only* the elements {g^{n}: `n`∈**Z**} will be of interest^{1}. So we need only consider a cyclic group G, with g a generator (i.e. all elements of G are powers of g). We consider only the finite case, so G is a finite cyclic group; assume |G|=N (N will be very large, of course, or the bad guys can solve the discrete logarithm problem merely by enumerating all elements of G).

As G is finite cyclic, it is isomorphic to C_{N}, the group of integers modulo N, with addition (modulo N) as the group operation, and the generator 1. Note that we've exchanged "multiplication" here for "addition", so "exponentiation" becomes "multiplication", and "logarithm" becomes "division". Put this way, the problem is TRIVIAL! For instance, say we take N=2^{32}, and work in G=C_{N} with the generator "1". This is just computer arithmetic on 32-bit words. Even worse, our exponentiation is just adding together `n` 1's, and (of course) `n`×1 ≡ `n` mod 2^{32}. If asked for the discrete log of 0x12345678, one can immediately see that adding together 0x12345678 1's will result in 0x12345678. OUCH. Picking other generators (any odd number will do for this N) doesn't make things much harder, either (we can find the multiplicative inverse modulo N of any odd number; multiplying by the multiplicative inverse solves the discrete log problem with that base).
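To see just how broken this is, here's the attack in a few lines of Python (the particular odd generator and "secret" below are arbitrary choices for the demonstration):

```python
MOD = 2**32               # N = 2^32: ordinary 32-bit word arithmetic
g = 0x12345679            # any odd number generates (C_N, +)
n = 0x0BADCAFE            # the secret "exponent"
h = (n * g) % MOD         # "g^n" in additive notation is just n*g mod N

g_inv = pow(g, -1, MOD)   # multiplicative inverse of g modulo 2^32 (g is odd, so it exists)
recovered = (h * g_inv) % MOD
assert recovered == n     # discrete log solved in constant time
```

The inverse exists precisely because gcd(g, 2^{32})=1, i.e. g is odd -- which is why "any odd number will do" as a generator, and why none of them helps.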

To make discrete logs hard, we add a twist. The problem with C_{N} is that we know far too much about its structure. We'll still work with a cyclic group, but we'll obfuscate its structure in some way. G will still be isomorphic to C_{N}, but *we won't know how to compute the isomorphism*! Of course, we still need to know how to perform arithmetic on our G. It turns out that algebra supplies us with just what we need.

The easiest way is to pick some prime N=p, and work with F^{*}_{p}, the multiplicative group modulo p. This can be expressed by the numbers 1,...,p-1, and multiplication modulo p. And it turns out that this group is cyclic (more on that, and other cyclic groups, further on). Suppose g generates this group^{2}. Knowing `n`, we can easily compute g^{n} (in time linear in the size of `n`, see above!). But h=g^{n} is just some number 1,...,p-1, and (for large p) the binary representation of h offers no (known) hints to what `n` might be.
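The asymmetry is easy to demonstrate. In the sketch below, p is the Mersenne prime 2^{31}-1 and g=7 is a known primitive root modulo it (this is the classic MINSTD generator pair); a real system would use a much larger p. The forward direction is one `pow` call; the backward direction, done naively, is a hopeless linear search:

```python
p = 2_147_483_647     # the Mersenne prime 2^31 - 1 (far too small for real use)
g = 7                 # a primitive root mod p

n = 123_456_789
h = pow(g, n, p)      # forward direction: fast, O(log n) multiplications

def brute_force_dlog(h, g, p, limit):
    """Naive discrete log: try g^0, g^1, ... up to `limit` steps."""
    x = 1
    for k in range(limit):
        if x == h:
            return k
        x = x * g % p
    return None       # not found within `limit` steps

# Works for tiny exponents, but recovering n above would take ~10^8 steps,
# and for real (much larger) p the search is completely infeasible.
assert brute_force_dlog(pow(g, 1000, p), g, p, 10_000) == 1000
```

Better-than-naive algorithms exist (generic square-root-time methods, and sub-exponential ones for this particular group), but for well-chosen large p none is remotely close to the linear-time forward direction.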

Another cyclic group we might pick is the multiplicative group of a finite field. An amazing fact is that the multiplicative group of a finite field is cyclic -- so we can use that. Note that a finite field has size p^{k} for some prime p; taking `k`=1, we're left with the previous case.

Here's some more about how we do this, but you'll need to know a bit of algebra for it. The structure of F^{*}_{pk} is quite hard, but we know enough about it to perform arithmetic there. The typical representation is as the extension field of degree `k` of the field of numbers modulo p, F_{p}. If we pick a primitive polynomial p(`x`) of degree `k` over F_{p}, then our cyclic group is just

{`x`^{n} mod p(`x`) : `n`=0,1,...,p^{k}-2},

with multiplication modulo p(`x`) as the group operation. When p=2, we can multiply (and therefore exponentiate) and divide in F^{*}_{pk} very easily, but discrete logarithms appear intractable.
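Here's what "very easily" means for p=2, as a sketch. We represent elements of F_{2^{8}} as bytes (bit i is the coefficient of `x`^{i}); addition is XOR, and multiplication is shift-and-XOR with reduction modulo the chosen polynomial. The polynomial 0x11D (`x`^{8}+`x`^{4}+`x`^{3}+`x`^{2}+1) used below is one known primitive polynomial for k=8, so `x` itself (the byte 0x02) generates the whole multiplicative group:

```python
POLY = 0x11D   # x^8 + x^4 + x^3 + x^2 + 1, a primitive polynomial over F_2

def gf256_mul(a, b):
    """Multiply in F_{2^8} = F_2[x]/(POLY): XOR is addition, shift is times x."""
    r = 0
    while b:
        if b & 1:
            r ^= a           # add (XOR) the current multiple of a
        b >>= 1
        a <<= 1              # multiply a by x
        if a & 0x100:        # degree reached 8: reduce modulo POLY
            a ^= POLY
    return r

# x (= 0x02) generates the multiplicative group of 2^8 - 1 = 255 elements:
elem, seen = 1, set()
for _ in range(255):
    seen.add(elem)
    elem = gf256_mul(elem, 0x02)
assert elem == 1 and len(seen) == 255
```

Everything here is shifts and XORs -- no carries at all -- which is exactly why hardware loves these fields. Exponentiation is then square-and-multiply over `gf256_mul`, as before.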
#### IMPORTANT UPDATE

Don't believe everything you read on the WWW. Especially not if you don't trust the writer to know what he's talking about!

In fact, discrete logarithms in GF_{2^{k}} are (barely) tractable using current technology. Lots of preprocessing is needed, but it's only needed *once* per group. Bruce Schneier recommends *NOT* to use this field in any applications. (GF_{p} for large prime p is still safe, though, provided p-1 has a large prime factor.)

Still other cyclic groups are used, but these two should give the basic taste of the problem.

Note that publishing the base g and the group G=&lt;g&gt; is thought to make no difference to the hardness of the problem. Even better, no useful attacks are known that exploit precomputation -- so *everyone* can use the same g and G, and still consider themselves safe! Indeed, ISAKMP specifies a very small number of choices of groups and bases for Diffie-Hellman key exchange. The choice of group changes the *size* of N (there is a 512-bit group, a 1024-bit group, a 1536-bit group, etc.) and the computation time for key exchange. But cryptographers today consider discrete logarithms hard, even if everyone in the world used the same group. One caveat: if N (the size of the cyclic group) has only small prime factors, then the discrete logarithm problem can be solved quite quickly. When using F^{*}_{p}, which has N=p-1 elements, it is therefore wise to choose a *safe prime* p (one for which (p-1)/2 is also prime), so that p-1 has a large prime factor and the problem stays hard. So using a well-vetted public prime p can actually make for *better* security than guessing your own.
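For completeness, here's Diffie-Hellman key exchange over such a group, sketched with the classic textbook toy parameters p=23, g=5 (a safe prime and one of its generators; real groups are 1024 bits and up, as above):

```python
import secrets

# Toy parameters: p is a safe prime (23 = 2*11 + 1), g a generator of F*_p.
# NEVER use numbers this small outside a demonstration.
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent, 1 <= a <= p-2
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

A = pow(g, a, p)                   # Alice publishes g^a
B = pow(g, b, p)                   # Bob publishes g^b

# Each side raises the other's public value to its own secret:
key_alice = pow(B, a, p)           # (g^b)^a = g^(ab)
key_bob = pow(A, b, p)             # (g^a)^b = g^(ab)
assert key_alice == key_bob        # shared secret; eavesdropper sees only p, g, A, B
```

An eavesdropper who could take discrete logs would recover a from A (or b from B) and compute the key -- which is exactly why the whole scheme stands or falls on the hardness of this problem.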

One of the things that (we think and hope) makes discrete logarithms so hard to compute is the discrete and unordered setting. Continuous logarithms are, by comparison, extremely easy to compute. For instance, we have the extremely useful power series

ln(1+`x`) = `x` - `x`^{2}/2 + `x`^{3}/3 - `x`^{4}/4 + ...,

which lets us evaluate (or at least approximate, but that's enough for practical algorithmic purposes) similar logarithms in the real numbers, in the p-adic rationals, and other topological groups where we have norms and convergence. For the natural numbers, we can easily check if something is a power of g∈**N** by using the ordering we have. If we had something similar for the discrete case, discrete logarithms would be doable. For instance, computing the inverse (which see) is often very easy, because we have an appropriate norm, and can use some power series or similar trick. Discrete logarithms are hard because (apparently!) the analytic machinery of calculus doesn't work.
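To see how well the continuous case behaves, here's the power series above as a few lines of Python (the function name and term count are arbitrary; the series converges for |`x`| &lt; 1):

```python
import math

def ln1p_series(x, terms=60):
    """Partial sum of ln(1+x) = x - x^2/2 + x^3/3 - x^4/4 + ..."""
    return sum((-1) ** (k + 1) * x ** k / k for k in range(1, terms + 1))

# 60 terms already agree with the library logarithm to ~12 decimal places:
assert abs(ln1p_series(0.5) - math.log(1.5)) < 1e-12
```

Each extra term shrinks the error geometrically -- convergence is exactly the kind of "getting closer" that a norm buys you, and exactly what a finite cyclic group refuses to provide.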

#### Notes

- However, in a larger group G, the question "*is* h∈G a power of g?" is also of interest (and hard). For current practical purposes, it is less important.
- It's even non-trivial to find a generator. However, it's somewhat easier to *test* whether a given g generates F^{*}_{p} or not... *if* you know the factorisation of p-1. In any case, cryptosystems use well-known, well-publicised choices of p and g; computing the discrete logarithm is (thought to be) hard enough, and (unless the choice of p and g is particularly bad!) picking random p's and g's probably adds little security.
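That generator test from the second note is short enough to spell out. It rests on a standard fact: g generates F^{*}_{p} exactly when g^{(p-1)/q} ≠ 1 (mod p) for every prime q dividing p-1 -- which is why you need the factorisation. A sketch, with the function name chosen here for illustration:

```python
def is_generator(g, p, prime_factors):
    """Test whether g generates F*_p, given the prime factors of p-1.

    g is a generator iff g^((p-1)/q) != 1 (mod p) for every prime q | p-1:
    otherwise g's order divides some proper divisor (p-1)/q of the group order.
    """
    return all(pow(g, (p - 1) // q, p) != 1 for q in prime_factors)

# p = 23: p - 1 = 22 = 2 * 11
assert is_generator(5, 23, [2, 11])       # 5 is a primitive root mod 23
assert not is_generator(2, 23, [2, 11])   # 2 has order 11, not 22
```

Only one modular exponentiation per prime factor of p-1 is needed -- cheap, once the factorisation is in hand. Without it, no comparably fast test is known.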