Cumulative Distribution Function - NayiPathshala


1/08/2018

Cumulative Distribution Function


We want to introduce another important, general concept in this chapter.

Definition. Let X be a random variable, discrete or continuous. We define F
to be the cumulative distribution function of the random variable X (abbreviated
as cdf), where F(x) = P(X ≤ x).

Theorem 2. (a) If X is a discrete random variable,

\[F(x)=\sum\limits_{j}{p({{x}_{j}})}\]

where the sum is taken over all indices j satisfying xj ≤ x.

(b) If X is a continuous random variable with pdf f,

\[F(x)=\int_{-\infty }^{x}{f\left( s \right)}ds\]

Proof. Both of these results follow immediately from the definition.
EXAMPLE 14. Suppose that the random variable X assumes the three values
0, 1, and 2 with probabilities 1/3, 1/6, and 1/2, respectively. Then
F(x) = 0 if x < 0,
= 1/3 if 0 ≤ x < 1,
= 1/2 if 1 ≤ x < 2,
= 1 if x ≥ 2.

(Note that it is very important to indicate the inclusion or exclusion of the endpoints in describing the various intervals.)
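The step-function cdf of this discrete example can be sketched directly in Python. This is a minimal illustration, not part of the text; it assumes the jump sizes p(0) = 1/3, p(1) = 1/6, p(2) = 1/2 read off the cdf above, and `cdf` is a name chosen here for illustration.

```python
# Step-function cdf for a discrete X with jumps p(0)=1/3, p(1)=1/6, p(2)=1/2.
from fractions import Fraction

pmf = {0: Fraction(1, 3), 1: Fraction(1, 6), 2: Fraction(1, 2)}

def cdf(x):
    """F(x) = P(X <= x): sum p(x_j) over all possible values x_j <= x."""
    return sum(p for xj, p in pmf.items() if xj <= x)

print(cdf(-1))   # 0      (below all possible values)
print(cdf(0))    # 1/3    (jump at x = 0 included, since F uses <=)
print(cdf(1.5))  # 1/2    (F is constant between the jumps at 1 and 2)
print(cdf(2))    # 1      (all probability accumulated)
```

Note how the `<=` in `cdf` captures exactly the endpoint convention the text warns about: the jump at each possible value belongs to the interval on its right.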

EXAMPLE 15. Suppose that X is a continuous random variable with pdf
f(x) = 2x, 0 < x < 1,
= 0, elsewhere.
Hence the cdf F is given by

\[F(x)=\begin{cases} 0 & \text{if }x\le 0, \\ \int_{0}^{x}{2s\,ds}={{x}^{2}} & \text{if }0<x\le 1, \\ 1 & \text{if }x>1. \end{cases}\]
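The relationship F(x) = ∫ f(s) ds in this example can be checked numerically. The sketch below is illustrative only; the function names `f` and `F` are chosen here, and `F` approximates the integral with a simple midpoint rule rather than evaluating it in closed form.

```python
# pdf f(x) = 2x on (0, 1); check numerically that F(x) = ∫_0^x 2s ds = x^2.
import math

def f(s):
    """The pdf: 2s on (0, 1), zero elsewhere."""
    return 2.0 * s if 0 < s < 1 else 0.0

def F(x, n=100_000):
    """Midpoint-rule approximation of the integral of f from 0 to x."""
    if x <= 0:
        return 0.0
    lo, hi = 0.0, min(x, 1.0)   # f vanishes beyond 1, so stop integrating there
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

for x in (0.25, 0.5, 0.9):
    assert math.isclose(F(x), x ** 2, rel_tol=1e-6)
assert math.isclose(F(2.0), 1.0, rel_tol=1e-6)  # F is 1 beyond x = 1
```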

The graphs obtained in Figs. 4.11 and 4.12 for the cdf's are (in each case) quite
typical in the following sense.
(a) If X is a discrete random variable with a finite number of possible values,
the graph of the cdf F will be made up of horizontal line segments (it is called a
step function). The function F is continuous except at the possible values of X,
namely x1, . . . , xn. At the value xj the graph will have a "jump" of magnitude
p(xj) = P(X = xj).
(b) If X is a continuous random variable, F will be a continuous function for
all x.
(c) The cdf F is defined for all values of x, which is an important reason for
considering it.
There are two important properties of the cdf which we shall summarize in the
following theorem.

Theorem 3. 
(a) The function F is nondecreasing. That is, if x1 ≤ x2, we have
F(x1) ≤ F(x2).
(b) lim x→-∞ F(x) = 0 and lim x→∞ F(x) = 1. [We often write this as
F(-∞) = 0, F(∞) = 1.]

Proof. (a) Define the events A and B as follows: A = {X ≤ x1}, B = {X ≤ x2}. Then, since x1 ≤ x2, we have A ⊂ B and, by Theorem 1.5, P(A) ≤ P(B), which is the required result.
(b) In the continuous case we have
\[F\left( -\infty  \right)=\underset{x\to -\infty }{\mathop{\lim }}\,\int_{-\infty }^{x}{f(s)ds=0}\]
\[F\left( \infty  \right)=\underset{x\to \infty }{\mathop{\lim }}\,\int_{-\infty }^{x}{f\left( s \right)ds=1}\]

In the discrete case the argument is analogous.
The cumulative distribution function is important for a number of reasons.
This is true particularly when we deal with a continuous random variable, for in
that case we cannot study the probabilistic behavior of X by computing P(X = x).
That probability always equals zero in the continuous case. However, we can ask

about P(X ≤ x) and, as the next theorem demonstrates, obtain the pdf of X.

Theorem 4. 
(a) Let F be the cdf of a continuous random variable with pdf f. Then
f(x) = d/dx F(x),
for all x at which F is differentiable.
(b) Let X be a discrete random variable with possible values x1, x2, . . . ,
and suppose that it is possible to label these values so that x1 < x2 < · · ·.

Let F be the cdf of X. Then

\[p\left( {{x}_{j}} \right)=P\left( X={{x}_{j}} \right)=F\left( {{x}_{j}} \right)-F\left( {{x}_{j-1}} \right)\]

Proof. (a)
\[F\left( x \right)=P\left( X\le x \right)=\int_{-\infty }^{x}{f\left( s \right)\,ds}\]
Thus applying the fundamental
theorem of the calculus we obtain, F'(x) = f(x).

(b) Since we assumed x1 < x2 < · · · , we have
F(xj) = P(X = xj ∪ X = xj-1 ∪ · · · ∪ X = x1)
= p(xj) + p(xj-1) + · · · + p(x1),
F(xj-1) = P(X = xj-1 ∪ X = xj-2 ∪ · · · ∪ X = x1)
= p(xj-1) + p(xj-2) + · · · + p(x1).

Hence F(xj) - F(xj-1) = P(X = xj) = p(xj).
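Part (b) of the theorem says the point probabilities are successive differences of the cdf. A short sketch, reusing the cdf values of the discrete example earlier (1/3, 1/2, 1 at the jump points 0, 1, 2; names here are illustrative):

```python
# Recover p(x_j) as F(x_j) - F(x_{j-1}), with F(x_0 - anything) = 0.
from fractions import Fraction

xs = [0, 1, 2]
F_at = {0: Fraction(1, 3), 1: Fraction(1, 2), 2: Fraction(1, 1)}  # cdf at jumps

pmf = {}
prev = Fraction(0)          # F just below the smallest possible value is 0
for xj in xs:
    pmf[xj] = F_at[xj] - prev
    prev = F_at[xj]

# pmf is now {0: 1/3, 1: 1/6, 2: 1/2}, recovering the original probabilities
```

The running variable `prev` plays the role of F(x_{j-1}); the differencing works precisely because the values were labeled in increasing order.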

Note: Let us briefly reconsider (a) of the above theorem. Recall the definition of the derivative of the function F:

\[{F}'(x)=\underset{h\to 0}{\mathop{\lim }}\,\frac{F\left( x+h \right)-F\left( x \right)}{h}=\underset{h\to {{0}^{+}}}{\mathop{\lim }}\,\frac{P\left( X\le x+h \right)-P\left( X\le x \right)}{h}=\underset{h\to {{0}^{+}}}{\mathop{\lim }}\,\frac{1}{h}P\left( x<X\le x+h \right)\]

Thus if h is small and positive,

\[{F}'(x)=f(x)\cong \frac{P\left( x<X\le x+h \right)}{h}.\]

That is, f(x) is approximately equal to the "amount of probability in the interval (x, x + h] per length h." Hence the name probability density function.
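This "probability per unit length" reading can be checked numerically with the cdf F(x) = x² of the earlier continuous example. A minimal sketch (the closed-form `F` below is an assumption taken from that example, not a general routine):

```python
# f(x) ≈ P(x < X <= x+h)/h = (F(x+h) - F(x))/h for small h > 0,
# using F(x) = x^2 on (0, 1) from the earlier example, where f(x) = 2x.
import math

def F(x):
    """cdf of the earlier example: 0 for x <= 0, x^2 on (0,1], 1 beyond."""
    return 0.0 if x <= 0 else min(x * x, 1.0)

x, h = 0.3, 1e-6
approx = (F(x + h) - F(x)) / h
assert math.isclose(approx, 2 * x, rel_tol=1e-4)  # close to f(0.3) = 0.6
```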

EXAMPLE 16.
Suppose that a continuous random variable has cdf F given by
     F(x) = 0,          x ≤ 0,
     = 1 - e-x,         x > 0.
Then F'(x) = e-x for x > 0, and thus the pdf f is given by
      f(x) = e-x,       x > 0,
      = 0,              elsewhere.
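The differentiation step in this example can be verified numerically: a finite-difference derivative of F should match e^(-x). A small sketch under that assumption (function names are chosen here for illustration):

```python
# cdf F(x) = 1 - e^{-x} for x > 0; differentiate numerically and compare
# with the claimed pdf f(x) = e^{-x}.
import math

def F(x):
    return 0.0 if x <= 0 else 1.0 - math.exp(-x)

def numeric_pdf(x, h=1e-6):
    """Central-difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2.0 * h)

for x in (0.5, 1.0, 2.0):
    assert math.isclose(numeric_pdf(x), math.exp(-x), rel_tol=1e-4)
```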
Note: A final word on terminology may be in order. This terminology, although not quite uniform, has become rather standardized. When we speak of the probability distribution of a random variable X we mean its pdf f if X is continuous, or its point probability function p defined for x1, x2, . . . if X is discrete. When we speak of the cumulative distribution function, or sometimes just the distribution function, we always mean F, where F(x) = P(X ≤ x).
