
Moment Generating Function


The moment generating function (m.g.f.) of a random variable X (about the origin) having the probability function f(x) is given by

$M_{X}(t) = E(e^{tX}) = \int e^{tx} f(x)\,dx$    (for a continuous random variable)

$M_{X}(t) = E(e^{tX}) = \sum_{x} e^{tx} f(x)$    (for a discrete random variable)
the integration or summation being extended over the entire range of x, t being a real parameter, and it being assumed that the right-hand side of the above equation is absolutely convergent for some positive number h such that $-h < t < h$. Thus

$M_{X}(t) = E(e^{tX}) = E\left[1 + tX + \frac{t^{2}X^{2}}{2!} + \cdots + \frac{t^{r}X^{r}}{r!} + \cdots\right]$

$= 1 + tE(X) + \frac{t^{2}}{2!}E(X^{2}) + \cdots + \frac{t^{r}}{r!}E(X^{r}) + \cdots$

where $\mu_{r}' = E(X^{r})$ is the rth moment of X about the origin. Thus we see that the coefficient of $\frac{t^{r}}{r!}$ in $M_{X}(t)$ gives $\mu_{r}'$ (about the origin). Since $M_{X}(t)$ generates moments, it is known as the moment generating function. Differentiating r times w.r.t. t and then putting t = 0, we get

$\left[\frac{d^{r}}{dt^{r}} M_{X}(t)\right]_{t=0} = \left[\mu_{r}' + \mu_{r+1}'\,t + \mu_{r+2}'\,\frac{t^{2}}{2!} + \cdots\right]_{t=0}$

$\mu_{r}' = \left[\frac{d^{r}}{dt^{r}} M_{X}(t)\right]_{t=0}$
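
As a quick illustration of this differentiation rule, here is a minimal sketch in Python's sympy, taking as an assumed example the m.g.f. of an Exp(1) random variable, $M_{X}(t) = (1-t)^{-1}$, whose rth raw moment is known to be $r!$:

```python
import sympy as sp

t = sp.symbols('t')

# Assumed example: the m.g.f. of an Exp(1) random variable,
# M_X(t) = 1/(1 - t) for t < 1; its rth raw moment is r!.
M = 1 / (1 - t)

# mu_r' = [d^r/dt^r M_X(t)] at t = 0
for r in range(1, 5):
    mu_r = sp.diff(M, t, r).subs(t, 0)
    print(f"mu_{r}' = {mu_r}")  # 1, 2, 6, 24, i.e. r!
```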

In general, the moment generating function of X about the point X = a is defined as

$M_{X}(t)$ (about X = a) $= E\left[e^{t(X-a)}\right]$

$= E\left[1 + t(X-a) + \frac{t^{2}}{2!}(X-a)^{2} + \cdots + \frac{t^{r}}{r!}(X-a)^{r} + \cdots\right]$

$= 1 + t\mu_{1}' + \frac{t^{2}}{2!}\mu_{2}' + \cdots + \frac{t^{r}}{r!}\mu_{r}' + \cdots$

where $\mu_{r}' = E\left[(X-a)^{r}\right]$ is the rth moment about the point X = a.

A Discrete Example

Suppose a discrete probability function is given by the following table.

X = x        P(X = x)
0            0.25
We obtain the moment generating function from the expected value of the exponential function, $M_{X}(t) = E(e^{tX})$.

We can then compute derivatives and obtain the moments about zero.
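
A minimal sketch of these two steps in sympy. Since only the first row of the table is preserved above, the rest of the distribution is assumed here for illustration, with the remaining mass placed at X = 1:

```python
import sympy as sp

t = sp.symbols('t')

# Assumed p.m.f. (only the first row of the table survives, so the
# remaining mass is placed, hypothetically, at X = 1):
# P(X = 0) = 0.25, P(X = 1) = 0.75
pmf = {0: sp.Rational(1, 4), 1: sp.Rational(3, 4)}

# M_X(t) = E(e^{tX}) = sum_x e^{tx} P(X = x)
M = sum(sp.exp(t * x) * p for x, p in pmf.items())
print(sp.expand(M))  # 3*exp(t)/4 + 1/4

# Moments about zero from derivatives at t = 0
for r in (1, 2):
    print(sp.diff(M, t, r).subs(t, 0))  # E(X) = 3/4, E(X^2) = 3/4
```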

Uniqueness theorem of m.g.f

The moment generating function of a distribution, if it exists, uniquely determines the distribution. This implies that corresponding to a given probability distribution there is only one m.g.f. (provided it exists), and corresponding to a given m.g.f. there is only one probability distribution.
Hence
$M_{X}(t)$ = $M_{Y}(t)$ $\Rightarrow$ X and Y are identically distributed.

Limitations 

Some of the important limitations of the m.g.f. are given below:

1) A random variable X may have no moments although its m.g.f. exists.
2) A random variable X can have an m.g.f. and some moments, yet the m.g.f. does not generate the moments.
3) A random variable X can have all or some moments, but the m.g.f. does not exist except perhaps at one point (see the sketch after this list).
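
Limitation 3 is classically illustrated by the lognormal distribution; taking the standard lognormal as an assumed example, every raw moment exists ($\mu_{r}' = e^{r^{2}/2}$), yet the series $\sum_{r} \frac{t^{r}}{r!}\mu_{r}'$ diverges for every t > 0, so the m.g.f. does not exist. A minimal numerical sketch:

```python
import math

# Standard lognormal: raw moments mu_r' = exp(r^2/2) all exist, yet
# the m.g.f. series sum_r (t^r / r!) mu_r' diverges for every t > 0.
t = 0.01  # even a tiny positive t fails
for r in (1, 5, 10, 20, 40):
    # work with logarithms of the series terms to avoid overflow
    log_term = r * math.log(t) - math.lgamma(r + 1) + r**2 / 2
    print(f"r = {r:2d}: log(term) = {log_term:8.1f}")
# log(term) grows without bound, so the terms (and the series) diverge.
```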

Mean and Variance from the m.g.f.

The mean and variance are therefore

$\mu = E(X) = M'(0)$

$\sigma^{2} = E(X^{2}) - [E(X)]^{2} = M''(0) - [M'(0)]^{2}$.

It is also true that

$\mu_{n} = \sum_{j=0}^{n} \binom{n}{j} (-1)^{n-j} \mu_{j}' (\mu_{1}')^{n-j}$,

where $\mu_{0}^{'}$ = 1 and $\mu_{j}^{'}$ is the jth raw moment. 
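
As a minimal sketch of this conversion from raw to central moments, assuming for illustration the raw moments of an Exp(1) variate, $\mu_{j}' = j!$:

```python
from math import comb, factorial

# Assumed example: raw moments of Exp(1), mu_j' = j!
raw = [factorial(j) for j in range(5)]  # mu_0' = 1, mu_1' = 1, mu_2' = 2, ...

def central_moment(n, raw):
    """mu_n = sum_j C(n, j) (-1)^(n-j) mu_j' (mu_1')^(n-j)."""
    return sum(comb(n, j) * (-1) ** (n - j) * raw[j] * raw[1] ** (n - j)
               for j in range(n + 1))

for n in (2, 3, 4):
    print(f"mu_{n} = {central_moment(n, raw)}")  # 1, 2, 9 for Exp(1)
```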

Examples

Some problems are solved below on moment generating function:

Example 1: If the moments of a variate X are defined by $E(X^{r})$ = 0.6, r = 1, 2, 3, ...,

show that P(X = 0) = 0.4, P(X = 1) = 0.6, P(X $\geq$ 2) = 0.

Solution: The m.g.f of variate X is

$M_{X}(t) = \sum_{r=0}^{\infty} \frac{t^{r}}{r!}\,\mu_{r}'$

$= 1 + \sum_{r=1}^{\infty} \frac{t^{r}}{r!}(0.6)$

$= 0.4 + 0.6 \sum_{r=0}^{\infty} \frac{t^{r}}{r!}$

$= 0.4 + 0.6\,e^{t}$

But $M_{X}(t) = E(e^{tX}) = \sum_{x=0}^{\infty} e^{tx}\,P(X = x)$

$= P(X = 0) + e^{t}\,P(X = 1) + \sum_{x=2}^{\infty} e^{tx}\,P(X = x)$

Comparing the two expressions for $M_{X}(t)$ term by term gives

P(X = 0) = 0.4, P(X = 1) = 0.6, P(X $\geq$ 2) = 0.

Hence proved
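
A quick check of this result (a sketch in sympy): every derivative of the m.g.f. found above should equal 0.6 at t = 0.

```python
import sympy as sp

t = sp.symbols('t')

# The distribution found above: P(X = 0) = 0.4, P(X = 1) = 0.6
M = sp.Rational(2, 5) + sp.Rational(3, 5) * sp.exp(t)

# Every raw moment E(X^r) should come out to 0.6
for r in (1, 2, 3, 4):
    print(sp.diff(M, t, r).subs(t, 0))  # 3/5 each time
```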

Example 2: Find the m.g.f. of the random variable whose moments are
$\mu_{r}' = (r + 1)!\,2^{r}$

Solution:
The m.g.f is given by

$M_{x}$(t) = $\sum_{r = 0}^{\infty}$ $\frac{t^{r}}{r!}$ $\mu_{r}'$

= $\sum_{r  = 0}^{\infty}$ $\frac{t^{r}}{r!}$ $(r + 1)!2^{r}$

= $\sum_{r = 0}^{\infty}(r + 1)(2t)^{r}$

= 1 + 2(2t) + 3(2t)$^{2}$ + 4(2t)$^{3}$ + $\cdots$

= $(1 - 2t)^{-2}$, valid for $|t| < \frac{1}{2}$.
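
A sketch verifying the closed form: re-expanding $(1 - 2t)^{-2}$ as a power series and reading off the coefficient of $\frac{t^{r}}{r!}$ should recover $\mu_{r}' = (r+1)!\,2^{r}$.

```python
import sympy as sp

t = sp.symbols('t')
M = (1 - 2*t)**(-2)

# Taylor-expand the m.g.f.; mu_r' is r! times the coefficient of t^r
series = sp.series(M, t, 0, 6).removeO()
for r in range(6):
    mu_r = series.coeff(t, r) * sp.factorial(r)
    print(f"mu_{r}' = {mu_r}, expected {sp.factorial(r + 1) * 2**r}")
```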
