
1/21/2018

Conditional Expectations


In the following chapters we shall have occasion to find the expected value of random variables in conditional distributions, or the expected value of one random variable given the value of another.

Definition 1 

Conditional expectation  Let (X, Y) be a two-dimensional random variable and g(·, ·) a function of two variables. The conditional expectation of g(X, Y) given X = x, denoted by E[g(X, Y) | X = x], is defined to be

                    E[g(X, Y) | X = x] = ∫_{-∞}^{∞} g(x, y) f_{Y|X}(y|x) dy               (1)

if (X, Y) are jointly continuous, and

                    E[g(X, Y) | X = x] = ∑_j g(x, y_j) f_{Y|X}(y_j|x)                     (2)

if (X, Y) are jointly discrete, where the summation is over all possible values y_j.
In particular, if g(x, y) = y, we have defined E[Y | X = x] = E[Y | x]. Both E[Y | x] and E[g(X, Y) | x] are functions of x. Note that this definition can be generalized to more than two dimensions. For example, let (X_1, ..., X_k, Y_1, ..., Y_m) be a (k + m)-dimensional continuous random variable with density f_{X_1, ..., X_k, Y_1, ..., Y_m}(x_1, ..., x_k, y_1, ..., y_m); then

                    E[g(X_1, ..., X_k, Y_1, ..., Y_m) | X_1 = x_1, ..., X_k = x_k]
                        = ∫_{-∞}^{∞} ··· ∫_{-∞}^{∞} g(x_1, ..., x_k, y_1, ..., y_m) f_{Y_1, ..., Y_m | X_1, ..., X_k}(y_1, ..., y_m | x_1, ..., x_k) dy_1 ··· dy_m


EXAMPLE 1

In the experiment of tossing two tetrahedra, with X the number on the first and Y the larger of the two numbers, we found that

                    f_{Y|X}(y | 2) = 1/2     for y = 2
                                     1/4     for y = 3
                                     1/4     for y = 4

Hence E[Y | X = 2] = ∑_y y f_{Y|X}(y | 2) = 2·(1/2) + 3·(1/4) + 4·(1/4) = 11/4.
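As a quick numerical check (a minimal Python sketch, not part of the original example), we can enumerate the 16 equally likely outcomes of the two tetrahedra, keep those with X = 2, and average Y:

from fractions import Fraction
from itertools import product

# All 16 equally likely outcomes (first toss, second toss) of two fair tetrahedra.
outcomes = list(product(range(1, 5), repeat=2))

# Keep the outcomes with X = 2; Y is the larger of the two numbers.
ys_given_x2 = [max(a, b) for (a, b) in outcomes if a == 2]

# E[Y | X = 2] is the average of Y over these equally likely outcomes.
print(sum(Fraction(v) for v in ys_given_x2) / len(ys_given_x2))   # 11/4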


EXAMPLE 2 

For f_{X,Y}(x, y) = (x + y) I_{(0,1)}(x) I_{(0,1)}(y), we found that

                    f_{Y|X}(y|x) = (x + y)/(x + 1/2) · I_{(0,1)}(y)                       for 0 < x < 1

Hence

                    E[Y | X = x] = ∫_0^1 y · (x + y)/(x + 1/2) dy = (x/2 + 1/3)/(x + 1/2)          for 0 < x < 1
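This regression curve can also be checked symbolically. The following is a minimal sketch (assuming SymPy is available; it is not part of the original example):

import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Conditional density f_{Y|X}(y|x) from Example 2, valid for 0 < x < 1 and 0 < y < 1.
f_cond = (x + y) / (x + sp.Rational(1, 2))

# E[Y | X = x] = integral of y * f_{Y|X}(y|x) over (0, 1).
e_y_given_x = sp.integrate(y * f_cond, (y, 0, 1))

# Compare with the closed form (x/2 + 1/3)/(x + 1/2); the difference simplifies to 0.
print(sp.simplify(e_y_given_x - (x/2 + sp.Rational(1, 3)) / (x + sp.Rational(1, 2))))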

As we stated above, E[g(Y) | x] is, in general, a function of x. Let us denote it by h(x); that is, h(x) = E[g(Y) | x]. Now we can evaluate the expectation of h(X), a function of X, and we will have E[h(X)] = E[E[g(Y) | X]]. This gives us

                    E[E[g(Y)|X]] = E[h(X)] = ∫_{-∞}^{∞} h(x) f_X(x) dx
                                           = ∫_{-∞}^{∞} E[g(Y)|x] f_X(x) dx
                                           = ∫_{-∞}^{∞} [∫_{-∞}^{∞} g(y) f_{Y|X}(y|x) dy] f_X(x) dx
                                           = ∫_{-∞}^{∞} ∫_{-∞}^{∞} g(y) f_{Y|X}(y|x) f_X(x) dy dx
                                           = ∫_{-∞}^{∞} ∫_{-∞}^{∞} g(y) f_{X,Y}(x, y) dy dx
                                           = E[g(Y)]
Thus, for jointly continuous random variables X and Y, we have proved the following simple yet very useful theorem (the proof for jointly discrete X and Y is similar).

Theorem 1

Let (X, Y) be a two-dimensional random variable; then

                    E[g(Y)] = E[E[g(Y) | X]],                                         (3)

and in particular

                    E[Y] = E[E[Y | X]].                                               (4)
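To see equation (4) in action, here is a small Python check (a sketch, not from the original text) on the two-tetrahedra experiment of Example 1: E[Y] computed directly agrees with E[E[Y | X]].

from fractions import Fraction
from itertools import product
from collections import defaultdict

# (X, Y) for each of the 16 equally likely tosses: X = first number, Y = larger number.
pairs = [(a, max(a, b)) for (a, b) in product(range(1, 5), repeat=2)]

# E[Y] computed directly.
e_y = sum(Fraction(y2) for (_, y2) in pairs) / len(pairs)

# E[E[Y | X]]: average Y within each value of X, then average over X (uniform on 1..4).
by_x = defaultdict(list)
for a, y2 in pairs:
    by_x[a].append(y2)
e_e = sum(Fraction(sum(v), len(v)) for v in by_x.values()) / len(by_x)

print(e_y, e_e)   # both equal 25/8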

 Definition 2 

Regression curve  E[Y | X = x] is called the regression curve of Y on x. It is also denoted by μ_{Y|X=x} = μ_{Y|x}.


Definition 3 

Conditional variance  The variance of Y given X = x is defined by var[Y | X = x] = E[Y² | X = x] − (E[Y | X = x])².

Theorem 2 

var[Y] = E[var[Y | X]] + var[E[Y | X]].

PROOF

                    E[var[Y|X]] = E[E[Y²|X]] − E[(E[Y|X])²]
                                = E[Y²] − (E[Y])² − E[(E[Y|X])²] + (E[Y])²
                                = var[Y] − E[(E[Y|X])²] + (E[E[Y|X]])²
                                = var[Y] − var[E[Y|X]].

Let us note in words what the two theorems say. Equation (4) states that the mean of Y is the mean or expectation of the conditional mean of Y, and Theorem 2 states that the variance of Y is the mean or expectation of the conditional variance of Y, plus the variance of the conditional mean of Y.
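For a concrete check of Theorem 2, the decomposition can be verified symbolically for the density of Example 2 (again a SymPy sketch, not part of the original text):

import sympy as sp

x, y = sp.symbols('x y', positive=True)

f_xy = x + y                                # joint density on the unit square
f_x = sp.integrate(f_xy, (y, 0, 1))         # marginal of X: x + 1/2
f_cond = f_xy / f_x                         # conditional density f_{Y|X}(y|x)

e_y_x = sp.integrate(y * f_cond, (y, 0, 1))        # E[Y | X = x]
e_y2_x = sp.integrate(y**2 * f_cond, (y, 0, 1))    # E[Y^2 | X = x]
var_y_x = e_y2_x - e_y_x**2                        # var[Y | X = x]

e_y = sp.integrate(e_y_x * f_x, (x, 0, 1))                         # E[Y] via Theorem 1
mean_of_var = sp.integrate(var_y_x * f_x, (x, 0, 1))               # E[var[Y | X]]
var_of_mean = sp.integrate(e_y_x**2 * f_x, (x, 0, 1)) - e_y**2     # var[E[Y | X]]

# var[Y] computed directly from the joint density; the difference simplifies to 0.
var_y = sp.integrate(y**2 * f_xy, (y, 0, 1), (x, 0, 1)) - e_y**2
print(sp.simplify(var_y - (mean_of_var + var_of_mean)))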
We will conclude this subsection with one further theorem. The proof can be routinely obtained from Definition 1 and is left as an exercise. Also, the theorem can be generalized to more than two dimensions.

Theorem 3 

Let (X, Y) be a two-dimensional random variable and g_1(·) and g_2(·) functions of one variable. Then
                (i)  E[g_1(Y) + g_2(Y) | X = x] = E[g_1(Y) | X = x] + E[g_2(Y) | X = x];
                (ii) E[g_1(Y) g_2(X) | X = x] = g_2(x) E[g_1(Y) | X = x].
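Although the proof is left as an exercise, both parts are easy to check symbolically in a particular case. The sketch below (again assuming SymPy, with hypothetical choices g_1(y) = y², g_2(y) = e^y for part (i) and g_2(x) = cos x for part (ii)) uses the conditional density from Example 2:

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f_cond = (x + y) / (x + sp.Rational(1, 2))     # f_{Y|X}(y|x) from Example 2

def cond_E(expr):
    # E[expr | X = x]: integrate expr against the conditional density over (0, 1).
    return sp.integrate(expr * f_cond, (y, 0, 1))

g1, g2 = y**2, sp.exp(y)     # hypothetical functions of Y for part (i)
h = sp.cos(x)                # hypothetical function of X for part (ii)

print(sp.simplify(cond_E(g1 + g2) - (cond_E(g1) + cond_E(g2))))   # 0  (part i)
print(sp.simplify(cond_E(g1 * h) - h * cond_E(g1)))               # 0  (part ii)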
