Moment Generating Functions (denoted m(t)) are used in mathematical statistics to determine the raw moments of a probability distribution without having to do difficult integrals or tedious summations.

The moment generating function of a random variable Y with probability density function (pdf) f(y), or probability function p(y), is the expected value of e^(Yt).

In the case of a discrete random variable taking only non-negative integer values, this is: m(t) = SUM(y=0,infinity) e^(yt)p(y)

Where p(y) is the probability function giving the probability that Y takes the particular value y. In the case of a continuous random variable with pdf f(y): m(t) = INTEGRAL(-Infinity, Infinity) e^(yt) f(y) dy
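Either definition can be carried out mechanically by a computer algebra system. Here is a minimal sketch in Python, assuming the sympy library; the Poisson probability function p(y) = k^y*e^(-k)/y! is used purely as an illustration (it anticipates the Poisson entry in the list below):

    import sympy as sp

    # A sketch, assuming sympy: build an mgf straight from the discrete
    # definition m(t) = SUM(y=0,infinity) e^(yt) p(y), using the Poisson
    # probability function p(y) = k^y e^(-k) / y! as the illustration.
    t, y = sp.symbols('t y')
    k = sp.Symbol('k', positive=True)

    p = k**y * sp.exp(-k) / sp.factorial(y)
    m = sp.summation(sp.exp(y*t) * p, (y, 0, sp.oo))
    print(sp.simplify(m))   # exp(k*exp(t) - k), i.e. exp{k(e^t - 1)}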

Once the mgf is obtained it is used in the following way:
To find the expected value of Y^k for a particular random variable Y, take the kth derivative of the mgf and evaluate it at t=0; that is, E(Y^k) = m^(k)(0).
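As a concrete illustration of this recipe, here is a minimal sketch, again assuming sympy, applied to the normal mgf given in the list below (mean a, variance b^2):

    import sympy as sp

    # A sketch, assuming sympy: E(Y^k) is the kth derivative of the mgf
    # evaluated at t = 0. Illustrated with the normal mgf (mean a,
    # variance b^2) from the list below.
    t, a, b = sp.symbols('t a b')

    m = sp.exp(a*t + t**2 * b**2 / 2)
    EY  = sp.diff(m, t, 1).subs(t, 0)   # first raw moment: a
    EY2 = sp.diff(m, t, 2).subs(t, 0)   # second raw moment: a**2 + b**2
    print(EY, sp.expand(EY2))

Note that EY2 - EY^2 = b^2, recovering the variance, as it should.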

Some mgfs for common distributions are listed below (a numerical check of one entry follows the list):

Discrete Distributions:

Binomial: m(t) = {p*e^t + (1-p)}^n, where n is the number of trials and p is the probability of success on a single trial.

Geometric: m(t) = (p*e^t)/(1-(1-p)*e^t), where p is the probability of success on each trial (the success that ends the series).

Poisson: m(t) = exp{k(e^t - 1)}, where k is the rate parameter, equal to the expected value of Y.

Continuous Distributions:

Uniform: m(t) = (e^(tk) - e^(tj))/(t(k - j)), where j and k are the lower and upper limits of the distribution respectively.

Normal: m(t) = exp{at + (t^2*b^2)/2}, where a is the mean E(Y) and b^2 is the variance.

Exponential: m(t) = 1/(1 - Bt), where B is the mean.

Gamma: m(t) = (1 - Bt)^(-A), where A is the shape parameter and B is the scale parameter, so A*B is the mean.

Chi-Square: m(t) = (1 - 2t)^(-v/2), where v is the degrees of freedom.
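Any entry in the list can be sanity-checked by comparing it against a sample average of e^(tY). Here is a minimal sketch, assuming numpy; the binomial parameters n = 10, p = 0.3 and the point t = 0.4 are arbitrary illustrative choices:

    import numpy as np

    # A sketch, assuming numpy: compare the binomial mgf from the list
    # above against a Monte Carlo estimate of E[e^(tY)].
    rng = np.random.default_rng(0)
    n, p, t = 10, 0.3, 0.4

    y = rng.binomial(n, p, size=200_000)
    estimate = np.mean(np.exp(t * y))        # sample average of e^(tY)
    formula = (p * np.exp(t) + (1 - p))**n   # binomial entry above

    print(estimate, formula)   # should agree to a couple of decimal places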

An example: suppose we wish to find the expected value of Y^2, where Y has an exponential distribution with parameter 2. That is, B = 2. Then Y has mgf:
m(t) = 1/(1 - 2t)
Now we find the first and second derivatives.
m'(t) = 2/(1 - 2t)^2
m''(t) = 8/(1 - 2t)^3

Now, m''(0) = 8, so the expected value of Y^2 is 8, where Y ~ Exp(2). As a check, E(Y^2) = Var(Y) + E(Y)^2 = B^2 + B^2 = 4 + 4 = 8.
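The same differentiation can be verified symbolically; a minimal sketch, again assuming sympy:

    import sympy as sp

    # A sketch, assuming sympy: differentiate m(t) = 1/(1 - 2t) twice
    # and substitute t = 0, repeating the worked example above.
    t = sp.Symbol('t')

    m = 1 / (1 - 2*t)
    m2 = sp.diff(m, t, 2)    # 8/(1 - 2t)^3
    print(m2.subs(t, 0))     # 8, i.e. E(Y^2) for Y ~ Exp(2)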
