# The Moment Generating Function of the Binomial Distribution


MSc. Econ: MATHEMATICAL STATISTICS, 1996


Consider the binomial function

$$
b(x; n, p) = \frac{n!}{x!\,(n-x)!}\, p^x q^{n-x}, \qquad q = 1 - p. \tag{1}
$$

Then the moment generating function is given by

$$
\begin{aligned}
M_x(t) &= \sum_{x=0}^{n} e^{xt}\, \frac{n!}{x!\,(n-x)!}\, p^x q^{n-x} \\
&= \sum_{x=0}^{n} \frac{n!}{x!\,(n-x)!}\, (pe^t)^x q^{n-x} \\
&= (q + pe^t)^n,
\end{aligned} \tag{2}
$$

where the final equality is understood by recognising that the sum is the binomial expansion of $(q + pe^t)^n$. If we differentiate the moment generating function with respect to $t$ using the function-of-a-function rule, then we get

$$
\frac{dM_x(t)}{dt} = n(q + pe^t)^{n-1}pe^t = npe^t(q + pe^t)^{n-1}. \tag{3}
$$

Evaluating this at t = 0 gives

$$
E(x) = np(q + p)^{n-1} = np. \tag{4}
$$

Notice that this result is already familiar and that we have obtained it previously by somewhat simpler means.
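As a quick numerical check (this snippet is not part of the original notes, and the function names are purely illustrative), the moment generating function evaluated directly as the sum in (2) can be compared against the closed form $(q + pe^t)^n$:

```python
from math import comb, exp

def binomial_mgf_sum(t, n, p):
    """M_x(t) evaluated directly as the sum over the binomial probabilities."""
    q = 1.0 - p
    return sum(exp(x * t) * comb(n, x) * p**x * q**(n - x) for x in range(n + 1))

def binomial_mgf_closed(t, n, p):
    """The closed form (q + p e^t)^n obtained from the binomial expansion."""
    q = 1.0 - p
    return (q + p * exp(t))**n
```

With, say, $n = 10$, $p = 0.3$ and $t = 0.5$, the two evaluations agree to within floating-point error, and both equal $1$ at $t = 0$, as any moment generating function must.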
To find the second moment, we use the product rule

$$
\frac{d(uv)}{dx} = u\frac{dv}{dx} + v\frac{du}{dx} \tag{5}
$$

to get

$$
\begin{aligned}
\frac{d^2M_x(t)}{dt^2} &= npe^t(n-1)(q + pe^t)^{n-2}pe^t + (q + pe^t)^{n-1}npe^t \\
&= npe^t(q + pe^t)^{n-2}\left\{(n-1)pe^t + (q + pe^t)\right\} \\
&= npe^t(q + pe^t)^{n-2}\left(q + npe^t\right).
\end{aligned} \tag{6}
$$


Evaluating this at $t = 0$ gives

$$
E(x^2) = np(q + p)^{n-2}(q + np) = np(q + np). \tag{7}
$$

From this, we see that

$$
V(x) = E(x^2) - \left\{E(x)\right\}^2 = np(q + np) - n^2p^2 = npq. \tag{8}
$$
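The first two moments can also be recovered numerically from the closed-form moment generating function by finite differences at $t = 0$ (a sketch, not from the notes; `mean_and_variance_from_mgf` is an illustrative name):

```python
from math import exp

def binomial_mgf(t, n, p):
    # closed-form binomial MGF, (q + p e^t)^n
    return (1.0 - p + p * exp(t))**n

def mean_and_variance_from_mgf(n, p, h=1e-4):
    # Central finite differences at t = 0: the first difference approximates
    # E(x) = M'(0), the second approximates E(x^2) = M''(0); then
    # V(x) = E(x^2) - {E(x)}^2.
    m1 = (binomial_mgf(h, n, p) - binomial_mgf(-h, n, p)) / (2 * h)
    m2 = (binomial_mgf(h, n, p) - 2 * binomial_mgf(0.0, n, p)
          + binomial_mgf(-h, n, p)) / h**2
    return m1, m2 - m1**2
```

For $n = 10$ and $p = 0.3$ this reproduces $E(x) = np = 3$ and $V(x) = npq = 2.1$ to within the accuracy of the difference scheme.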

## Theorems Concerning Moment Generating Functions

In finding the variance of the binomial distribution, we have pursued a method which is more laborious than it need be. The following theorem shows how to generate the moments about an arbitrary datum, which we may take to be the mean of the distribution.

(9) The function which generates moments about the mean of a random variable is given by $M_{x-\mu}(t) = \exp\{-\mu t\}M_x(t)$, where $M_x(t)$ is the function which generates moments about the origin.

This result is understood by considering the following identity:

$$
M_{x-\mu}(t) = E\left(\exp\{(x - \mu)t\}\right) = e^{-\mu t}E(e^{xt}) = \exp\{-\mu t\}M_x(t). \tag{10}
$$

For an example, consider once more the binomial function. The moment generating function about the mean is then

$$
\begin{aligned}
M_{x-\mu}(t) &= e^{-npt}(q + pe^t)^n = (qe^{-pt} + pe^te^{-pt})^n \\
&= (qe^{-pt} + pe^{qt})^n.
\end{aligned} \tag{11}
$$
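Equation (11) can be checked numerically (an illustrative sketch, not part of the notes): $(qe^{-pt} + pe^{qt})^n$ should agree with $e^{-npt}M_x(t)$ at any $t$:

```python
from math import exp

def binomial_mgf(t, n, p):
    # MGF about the origin, (q + p e^t)^n
    return (1.0 - p + p * exp(t))**n

def binomial_central_mgf(t, n, p):
    # MGF about the mean mu = np, (q e^{-pt} + p e^{qt})^n as in (11)
    q = 1.0 - p
    return (q * exp(-p * t) + p * exp(q * t))**n
```

For example, with $n = 8$, $p = 0.25$ and $t = 0.7$, `binomial_central_mgf` matches $e^{-npt}$ times `binomial_mgf` to floating-point precision.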

Differentiating this once gives

$$
\frac{dM_{x-\mu}(t)}{dt} = n(qe^{-pt} + pe^{qt})^{n-1}(-pqe^{-pt} + qpe^{qt}). \tag{12}
$$

At $t = 0$, this has the value of zero, as it should. Differentiating a second time according to the product rule gives

$$
\frac{d^2M_{x-\mu}(t)}{dt^2} = u\left(p^2qe^{-pt} + q^2pe^{qt}\right) + v\,\frac{du}{dt}, \tag{13}
$$


where

$$
u(t) = n(qe^{-pt} + pe^{qt})^{n-1} \quad\text{and}\quad v(t) = -pqe^{-pt} + qpe^{qt}. \tag{14}
$$

At t = 0 these become u(0) = n and v(0) = 0. It follows that

$$
V(x) = n(p^2q + q^2p) = npq(p + q) = npq, \tag{15}
$$

as we know from a previous derivation.

Another important theorem concerns the moment generating function of a sum of independent random variables:

(16) If $x \sim f(x)$ and $y \sim f(y)$ are two independently distributed random variables with moment generating functions $M_x(t)$ and $M_y(t)$, then their sum $z = x + y$ has the moment generating function $M_z(t) = M_x(t)M_y(t)$.

This result is a consequence of the fact that the independence of $x$ and $y$ implies that their joint probability density function is the product of their individual marginal probability density functions: $f(x, y) = f(x)f(y)$. From this, it follows that

$$
\begin{aligned}
M_{x+y}(t) &= \int_x \int_y e^{(x+y)t} f(x, y)\, dy\, dx \\
&= \int_x e^{xt} f(x)\, dx \int_y e^{yt} f(y)\, dy \\
&= M_x(t)M_y(t).
\end{aligned} \tag{17}
$$
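As an illustration of (16) and (17) for the binomial case (a sketch under the assumption that the two summands share a common $p$; not part of the notes), the product of the moment generating functions of independent binomial$(n_1, p)$ and binomial$(n_2, p)$ variables is $(q + pe^t)^{n_1 + n_2}$, the moment generating function of a binomial$(n_1 + n_2, p)$ variable:

```python
from math import exp

def binomial_mgf(t, n, p):
    # (q + p e^t)^n
    return (1.0 - p + p * exp(t))**n

def sum_mgf(t, n1, n2, p):
    # By the theorem, M_{x+y}(t) = M_x(t) M_y(t); with a common p this
    # product collapses to the binomial(n1 + n2, p) MGF.
    return binomial_mgf(t, n1, p) * binomial_mgf(t, n2, p)
```

For instance, `sum_mgf(t, 5, 7, 0.2)` coincides with `binomial_mgf(t, 12, 0.2)` at every $t$.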
