
1/22/2018

Continuity Theorem

(Levy's Continuity Theorem)

\[
X^{(n)} \rightsquigarrow X \quad \text{if and only if} \quad \phi_{X^{(n)}}(t) \to \phi_X(t) \;\; \text{for all } t \in \mathbb{R}^d.
\]
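As a quick numerical illustration (added here, not part of the original text), the sketch below uses the CLT as a test case: the standardized mean of iid Uniform(0,1) variables converges in distribution to $N(0,1)$, so by the theorem its characteristic function should approach $e^{-t^2/2}$ pointwise. The sample sizes, grid, and names (`t_grid`, `empirical_cf`) are illustrative choices, and the empirical characteristic function is only a Monte Carlo estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
t_grid = np.linspace(-3, 3, 7)

def empirical_cf(samples, t):
    # Monte Carlo estimate of phi(t) = E exp(i t X)
    return np.mean(np.exp(1j * np.outer(t, samples)), axis=1)

target = np.exp(-t_grid ** 2 / 2)            # characteristic function of N(0, 1)
for n in (1, 5, 50, 200):
    x = rng.uniform(size=(50_000, n))        # 50,000 replications of n iid Uniform(0,1)
    s = (x.mean(axis=1) - 0.5) / np.sqrt(1.0 / (12 * n))   # standardized sample mean
    err = np.max(np.abs(empirical_cf(s, t_grid) - target))
    print(f"n={n:4d}  max |phi_n(t) - exp(-t^2/2)| = {err:.4f}")
```

As $n$ grows the printed discrepancy should shrink toward the Monte Carlo noise level, consistent with the "only if" direction of the theorem.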
Proof. We assume that $X^{(n)} \rightsquigarrow X$. Since $\exp(it^{\top}X) = \cos(t^{\top}X) + i\sin(t^{\top}X)$, the real and imaginary parts of $\exp(it^{\top}X)$ are continuous and bounded functions of $X$, which together with implication 13 implies the pointwise convergence of the characteristic functions.
Conversely, we assume that $\phi_{X^{(n)}}(t) \to \phi_X(t)$ for all $t \in \mathbb{R}^d$ and show that for any continuous function $g$ that is zero outside a bounded and closed set, we have $E(g(X^{(n)})) \to E(g(X))$. By the Portmanteau theorem, this implies that $X^{(n)} \rightsquigarrow X$. Since $g$ is continuous on a compact set, it is uniformly continuous, and for every $\epsilon > 0$ we can select a $\delta > 0$ such that $\|x - y\| < \delta$ implies $|g(x) - g(y)| < \epsilon$.
Denoting by $Z$ a $N(0, \sigma^2 I)$ random vector that is independent of $X$ and of the sequence $X^{(n)}$, we have
\[
\begin{aligned}
|E(g(X^{(n)})) - E(g(X))|
&= |E(g(X^{(n)})) - E(g(X)) + E(g(X^{(n)}+Z)) - E(g(X^{(n)}+Z)) + E(g(X+Z)) - E(g(X+Z))| \\
&\le |E(g(X^{(n)})) - E(g(X^{(n)}+Z))| + |E(g(X^{(n)}+Z)) - E(g(X+Z))| + |E(g(X+Z)) - E(g(X))|.
\end{aligned}
\]
The first term above is bounded by $2\epsilon$, since for $\sigma$ sufficiently small
\[
\begin{aligned}
|E(g(X^{(n)})) - E(g(X^{(n)}+Z))|
&\le E\big(|g(X^{(n)}) - g(X^{(n)}+Z)|\, I(\|Z\| \le \delta)\big) + E\big(|g(X^{(n)}) - g(X^{(n)}+Z)|\, I(\|Z\| > \delta)\big) \\
&\le E(\epsilon) + 2\big(\sup_{w} |g(w)|\big)\, P(\|Z\| > \delta) \le 2\epsilon.
\end{aligned}
\]
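The argument above does not spell out why $P(\|Z\| > \delta)$ can be made small; one standard way to see it (an added remark, using only that $Z \sim N(0, \sigma^2 I)$ in $\mathbb{R}^d$) is Markov's inequality applied to $\|Z\|^2$:
\[
P(\|Z\| > \delta) \le \frac{E(\|Z\|^2)}{\delta^2} = \frac{d\,\sigma^2}{\delta^2} \;\longrightarrow\; 0 \quad \text{as } \sigma \to 0,
\]
so choosing $\sigma$ small enough makes $2(\sup_w |g(w)|)\, P(\|Z\| > \delta) \le \epsilon$.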
The third term above is also bounded by $2\epsilon$ by a similar argument. It remains to show that the second term converges to zero, that is, $E(g(X^{(n)}+Z)) \to E(g(X+Z))$. We will then have $|E(g(X^{(n)})) - E(g(X))| \to 0$, implying that $E(g(X^{(n)})) \to E(g(X))$ (for all continuous functions $g$ that are zero outside a bounded and closed set), which together with the Portmanteau theorem implies $X^{(n)} \rightsquigarrow X$.
We show below that $E(g(X^{(n)}+Z)) \to E(g(X+Z))$. We have
\[
\begin{aligned}
E(g(X^{(n)}+Z))
&= \frac{1}{(\sqrt{2\pi}\,\sigma)^d} \int\!\!\int g(x+z)\, \exp\big(-z^{\top}z/(2\sigma^2)\big)\, dz\, dF_{X^{(n)}}(x) \\
&= \frac{1}{(\sqrt{2\pi}\,\sigma)^d} \int\!\!\int g(u)\, \exp\big(-(u-x)^{\top}(u-x)/(2\sigma^2)\big)\, du\, dF_{X^{(n)}}(x) \\
&= \frac{1}{(\sqrt{2\pi}\,\sigma)^d} \int\!\!\int g(u) \prod_{j=1}^d \exp\Big(-\frac{(u_j-x_j)^2}{2\sigma^2}\Big)\, du\, dF_{X^{(n)}}(x) \\
&= \frac{1}{(\sqrt{2\pi}\,\sigma)^d} \int\!\!\int g(u) \prod_{j=1}^d \frac{\sigma}{\sqrt{2\pi}} \int \exp\big(i t_j (u_j - x_j) - \sigma^2 t_j^2/2\big)\, dt_j\, du\, dF_{X^{(n)}}(x) \\
&= \frac{1}{(2\pi)^d} \int\!\!\int\!\!\int g(u)\, \exp\big(i t^{\top}(u-x) - \sigma^2 t^{\top}t/2\big)\, dt\, du\, dF_{X^{(n)}}(x) \\
&= \frac{1}{(2\pi)^d} \int\!\!\int g(u)\, \exp\big(i t^{\top}u - \sigma^2 t^{\top}t/2\big)\, \phi_{X^{(n)}}(-t)\, dt\, du, \tag{*}
\end{aligned}
\]
where $u = x + z$. Note that we used Lemma 8.7.1 in the fourth equality and Proposition 8.7.2 in the last equality.
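The lemma and proposition are not reproduced on this page; presumably the identity supplied by Lemma 8.7.1 in the fourth equality is the univariate Gaussian Fourier identity
\[
\exp\Big(-\frac{w^2}{2\sigma^2}\Big) = \frac{\sigma}{\sqrt{2\pi}} \int_{\mathbb{R}} \exp\big(i t w - \sigma^2 t^2/2\big)\, dt,
\]
applied with $w = u_j - x_j$, while Proposition 8.7.2 presumably identifies $\int \exp(-i t^{\top} x)\, dF_{X^{(n)}}(x)$ with the characteristic function evaluated at $-t$.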
Since $g$ is continuous and non-zero only on a closed and bounded set, $g(u)$ may be made into a density by adding a constant to it and dividing by a constant. This implies that $E(g(X^{(n)}+Z))$ may be considered as an expectation over two random vectors: $U$, having density $c(g(u)+b)$, and $T$, having a Gaussian density. The argument of that expectation is the bounded function $\exp(it^{\top}u)\,\phi_{X^{(n)}}(-t)$, and so by the dominated convergence theorem for random variables (Proposition 8.3.1)
\[
\frac{1}{(2\pi)^d} \int\!\!\int g(u)\, \exp\big(i t^{\top}u - \sigma^2 t^{\top}t/2\big)\, \phi_{X^{(n)}}(-t)\, dt\, du
\;\longrightarrow\;
\frac{1}{(2\pi)^d} \int\!\!\int g(u)\, \exp\big(i t^{\top}u - \sigma^2 t^{\top}t/2\big)\, \phi_{X}(-t)\, dt\, du.
\]
Repeating the derivation in Equation (*) with $X$ substituted for $X^{(n)}$, we see that
\[
E(g(X+Z)) = \frac{1}{(2\pi)^d} \int\!\!\int g(u)\, \exp\big(i t^{\top}u - \sigma^2 t^{\top}t/2\big)\, \phi_{X}(-t)\, dt\, du,
\]
implying that $E(g(X^{(n)}+Z)) \to E(g(X+Z))$.
Note that Levy's continuity theorem above is similar to Proposition 2.4.2: the former equates convergence in distribution with convergence of characteristic functions, while the latter equates convergence in distribution with convergence of moment generating functions. An advantage of Levy's theorem is that in many cases the moment generating function does not exist, while the characteristic function always exists.
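For instance (an example added here for concreteness, not part of the original text), the standard Cauchy distribution has no moment generating function, since $E(e^{tX}) = \infty$ for every $t \neq 0$, yet its characteristic function exists and equals
\[
\phi(t) = E\big(e^{itX}\big) = e^{-|t|}, \qquad t \in \mathbb{R}.
\]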
The following result shows a way to prove multivariate convergence in distribution using a variety of univariate convergence results.
