Convergence in probability implies convergence in distribution
A sequence of random variables $\{X_n\}$ converges to $X$ in probability if for any $\varepsilon > 0$,
$$P(|X_n - X| \geq \varepsilon) \rightarrow 0.$$
The sequence converges to $X$ in distribution if
$$F_{X_n} \rightarrow F_X$$
at every point where $F_X$ is continuous.
(There is another equivalent definition of convergence in distribution in terms of weak convergence.)
It seems like a very simple result, but I cannot think of a clever proof.
probability probability-theory convergence random-variables weak-convergence
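For concreteness, the two definitions can be illustrated numerically. This is just an arbitrary sanity-check example (not part of the question): take $X$ standard normal and $X_n = X + Z_n/n$ with $Z_n$ independent standard normal, so $|X_n - X| \to 0$ in probability and $F_{X_n}(t) \to F_X(t)$ at every $t$.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
X = rng.standard_normal(N)          # X ~ N(0, 1)

eps, t = 0.1, 0.5                   # tolerance for the first definition; a continuity point of F_X
for n in (1, 10, 100):
    Xn = X + rng.standard_normal(N) / n   # X_n = X + Z_n / n
    p = np.mean(np.abs(Xn - X) >= eps)    # estimates P(|X_n - X| >= eps), should shrink to 0
    F = np.mean(Xn <= t)                  # estimates F_{X_n}(t), should approach F_X(t) ~ 0.6915
    print(n, p, F)
```

As $n$ grows, the estimated $P(|X_n - X| \geq \varepsilon)$ drops to $0$ and the estimated $F_{X_n}(0.5)$ settles near $\Phi(0.5)$.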
– Have you tried the Wikipedia article en.wikipedia.org/wiki/… ? Most books on probability theory include a proof. – Gautam Shenoy, Nov 14 '12 at 4:54
– Oh, how come I didn't find it! It looks like something I have in mind. Thank you so much! – Hawii, Nov 14 '12 at 5:34
asked Nov 14 '12 at 4:08 by Hawii · edited Jan 20 at 13:41 by Davide Giraudo
3 Answers
A slicker proof (and, more importantly, one that generalizes) than the one in the Wikipedia article is to observe that $X_n \Longrightarrow X$ if and only if $E f(X_n) \to E f(X)$ for all bounded continuous functions $f$. If you have convergence in probability, then you can apply the dominated convergence theorem (recalling that $f$ is bounded, and that for continuous $f$, $X_n \to X$ in probability implies $f(X_n) \to f(X)$ in probability) to conclude that $E|f(X_n) - f(X)| \to 0$, which implies the result.

answered Nov 14 '12 at 20:55 by Chris Janjigian
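The criterion $E f(X_n) \to E f(X)$ can be sanity-checked numerically. This is an arbitrary illustrative setup, not part of the proof: $X$ exponential (chosen to avoid symmetry cancellations), $X_n = X + Z_n/n$, and $f = \arctan$ as the bounded continuous test function.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
X = rng.exponential(1.0, N)     # any X works; exponential avoids symmetric cancellations
f = np.arctan                   # bounded continuous test function
EfX = f(X).mean()               # Monte Carlo estimate of E f(X)

errs = []
for n in (1, 10, 100):
    Xn = X + rng.standard_normal(N) / n    # X_n -> X in probability as n grows
    errs.append(abs(f(Xn).mean() - EfX))   # estimate of |E f(X_n) - E f(X)|
print(errs)
```

The estimated gap $|E f(X_n) - E f(X)|$ shrinks as $n$ grows, as the bounded-convergence argument predicts.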
Here is an answer that does not rely on dominated convergence.
To prove convergence in distribution, we only need to establish that $E[f(X_n)]$ converges to $E[f(X)]$ for every bounded continuous function $f$. By definition of the limit, we need to prove that for any $\epsilon > 0$ there is some $n_0 = n_0(\epsilon)$ such that the inequality $|E[f(X_n)] - E[f(X)]| < \epsilon$ holds for all $n > n_0$.

1. As suggested in another answer, the first step is to show that if $X_n$ converges to $X$ in probability, then $f(X_n)$ also converges to $f(X)$ in probability for any continuous $f$.
2. Let $f$ be any continuous function bounded by $K$. Take any $\epsilon > 0$ and show that
$$|E[f(X_n)] - E[f(X)]| \le E[|f(X_n) - f(X)|] \le (\epsilon/2)\, P(A_n^c) + 2K\, P(A_n),$$
where $A_n$ is the event $\{ |f(X_n) - f(X)| > \epsilon/2 \}$ (note $|f(X_n) - f(X)| \le 2K$ always).
3. It remains to note that $P(A_n^c) \le 1$ (obvious) and that, for $n$ large enough, $P(A_n) \le \epsilon/(4K)$ thanks to the convergence in probability established in step 1.

answered Sep 26 '17 at 4:11 by jlewk · edited Nov 17 '17 at 18:21
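The inequality in step 2 holds sample-wise, so it can be verified on simulated data. This is an arbitrary illustration (not part of the argument): $X$ exponential, a single perturbed $X_n$, and $f = \arctan$, which is bounded by $K = \pi/2$.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
K = np.pi / 2                    # arctan is bounded by K
eps = 0.2

X = rng.exponential(1.0, N)
Xn = X + rng.standard_normal(N) / 10       # one X_n reasonably close to X
d = np.abs(np.arctan(Xn) - np.arctan(X))   # |f(X_n) - f(X)|, always <= 2K
An = d > eps / 2                           # the event A_n from step 2

lhs = abs(np.arctan(Xn).mean() - np.arctan(X).mean())  # |E f(X_n) - E f(X)|
mid = d.mean()                                         # E|f(X_n) - f(X)|
rhs = (eps / 2) * np.mean(~An) + 2 * K * np.mean(An)   # the bound from step 2
print(lhs, mid, rhs)
```

Both inequalities $lhs \le mid \le rhs$ hold exactly on any sample, since the bound is pointwise.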
Chris Janjigian's answer is more or less correct, but you need almost sure convergence to be able to apply dominated convergence. Fortunately, convergence in probability implies almost sure convergence along a subsequence, and the proof can then proceed more or less as desired.
For more details, Lemma 3.7 of Kallenberg's Foundations of Modern Probability (first edition) is useful.

answered Feb 16 '14 at 3:14 by Roy D.
– Or you could apply the bounded convergence theorem. – Calculon, Mar 15 '15 at 11:33
– The dominated convergence theorem also applies with convergence in probability. – perlman, Oct 29 '17 at 0:26
Your Answer
StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f236955%2fconvergence-in-probability-implies-convergence-in-distribution%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
3 Answers
3
active
oldest
votes
3 Answers
3
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
A slicker proof (and more importantly one that generalizes) than the one in the wikipedia article is to observe that $X_n Longrightarrow X$ if and only if for all bounded continuous functions $f$ we have $E f(X_n) to E f(X)$. If you have convergence in probability then you can apply the dominated convergence theorem (recalling that $f$ is bounded and that for continuous functions $X_n to X$ in probability implies $f(X_n) to f(X)$ in probability) to conclude that $E |f(X_n) - f(X)| to 0$, which implies the result.
$endgroup$
add a comment |
$begingroup$
A slicker proof (and more importantly one that generalizes) than the one in the wikipedia article is to observe that $X_n Longrightarrow X$ if and only if for all bounded continuous functions $f$ we have $E f(X_n) to E f(X)$. If you have convergence in probability then you can apply the dominated convergence theorem (recalling that $f$ is bounded and that for continuous functions $X_n to X$ in probability implies $f(X_n) to f(X)$ in probability) to conclude that $E |f(X_n) - f(X)| to 0$, which implies the result.
$endgroup$
add a comment |
$begingroup$
A slicker proof (and more importantly one that generalizes) than the one in the wikipedia article is to observe that $X_n Longrightarrow X$ if and only if for all bounded continuous functions $f$ we have $E f(X_n) to E f(X)$. If you have convergence in probability then you can apply the dominated convergence theorem (recalling that $f$ is bounded and that for continuous functions $X_n to X$ in probability implies $f(X_n) to f(X)$ in probability) to conclude that $E |f(X_n) - f(X)| to 0$, which implies the result.
$endgroup$
A slicker proof (and more importantly one that generalizes) than the one in the wikipedia article is to observe that $X_n Longrightarrow X$ if and only if for all bounded continuous functions $f$ we have $E f(X_n) to E f(X)$. If you have convergence in probability then you can apply the dominated convergence theorem (recalling that $f$ is bounded and that for continuous functions $X_n to X$ in probability implies $f(X_n) to f(X)$ in probability) to conclude that $E |f(X_n) - f(X)| to 0$, which implies the result.
answered Nov 14 '12 at 20:55
Chris JanjigianChris Janjigian
4,97841935
4,97841935
add a comment |
add a comment |
$begingroup$
Here is an answer that does not rely on dominated convergence.
To prove convergence in distribution, we only need to establish that $E[f(X_n)]$ converges to $E[f(X)]$ for bounded continuous functions $f$.
By definition of the limit, we need to prove that for any $epsilon>0$, there some $n_0=n_0(epsilon)$ such that for all $n>n_0$ the inequality $| E[f(X_n)] - E[f(X)]| < epsilon $ holds.
- As suggested in another answer, the first step is to show that if $X_n$ converge to $X$ in probability then $f(X_n)$ also converges in probability to $f(X)$ for any continuous $f$.
- Let $f$ be any continuous function bounded by $K$. Take any $epsilon>0$ and show that
$$| E[f(X_n)] - E[f(X)]| le E[|f(X_n)] - E[f(X)|] le (epsilon/2) ; P(A_n^c) + K ; P(A_n)$$
where $A_n$ is the event ${ |f(X_n)] - E[f(X)| > epsilon /2 }$. - It remains to show that $P(A_n^c)le 1$ (obvious) and that for $n$ large enough, one has $P(A_n^c)le epsilon/(2 K)$ thanks to the convergence in probability established in 1.
$endgroup$
add a comment |
$begingroup$
Here is an answer that does not rely on dominated convergence.
To prove convergence in distribution, we only need to establish that $E[f(X_n)]$ converges to $E[f(X)]$ for bounded continuous functions $f$.
By definition of the limit, we need to prove that for any $epsilon>0$, there some $n_0=n_0(epsilon)$ such that for all $n>n_0$ the inequality $| E[f(X_n)] - E[f(X)]| < epsilon $ holds.
- As suggested in another answer, the first step is to show that if $X_n$ converge to $X$ in probability then $f(X_n)$ also converges in probability to $f(X)$ for any continuous $f$.
- Let $f$ be any continuous function bounded by $K$. Take any $epsilon>0$ and show that
$$| E[f(X_n)] - E[f(X)]| le E[|f(X_n)] - E[f(X)|] le (epsilon/2) ; P(A_n^c) + K ; P(A_n)$$
where $A_n$ is the event ${ |f(X_n)] - E[f(X)| > epsilon /2 }$. - It remains to show that $P(A_n^c)le 1$ (obvious) and that for $n$ large enough, one has $P(A_n^c)le epsilon/(2 K)$ thanks to the convergence in probability established in 1.
$endgroup$
add a comment |
$begingroup$
Here is an answer that does not rely on dominated convergence.
To prove convergence in distribution, we only need to establish that $E[f(X_n)]$ converges to $E[f(X)]$ for bounded continuous functions $f$.
By definition of the limit, we need to prove that for any $epsilon>0$, there some $n_0=n_0(epsilon)$ such that for all $n>n_0$ the inequality $| E[f(X_n)] - E[f(X)]| < epsilon $ holds.
- As suggested in another answer, the first step is to show that if $X_n$ converge to $X$ in probability then $f(X_n)$ also converges in probability to $f(X)$ for any continuous $f$.
- Let $f$ be any continuous function bounded by $K$. Take any $epsilon>0$ and show that
$$| E[f(X_n)] - E[f(X)]| le E[|f(X_n)] - E[f(X)|] le (epsilon/2) ; P(A_n^c) + K ; P(A_n)$$
where $A_n$ is the event ${ |f(X_n)] - E[f(X)| > epsilon /2 }$. - It remains to show that $P(A_n^c)le 1$ (obvious) and that for $n$ large enough, one has $P(A_n^c)le epsilon/(2 K)$ thanks to the convergence in probability established in 1.
$endgroup$
Here is an answer that does not rely on dominated convergence.
To prove convergence in distribution, we only need to establish that $E[f(X_n)]$ converges to $E[f(X)]$ for bounded continuous functions $f$.
By definition of the limit, we need to prove that for any $epsilon>0$, there some $n_0=n_0(epsilon)$ such that for all $n>n_0$ the inequality $| E[f(X_n)] - E[f(X)]| < epsilon $ holds.
- As suggested in another answer, the first step is to show that if $X_n$ converge to $X$ in probability then $f(X_n)$ also converges in probability to $f(X)$ for any continuous $f$.
- Let $f$ be any continuous function bounded by $K$. Take any $epsilon>0$ and show that
$$| E[f(X_n)] - E[f(X)]| le E[|f(X_n)] - E[f(X)|] le (epsilon/2) ; P(A_n^c) + K ; P(A_n)$$
where $A_n$ is the event ${ |f(X_n)] - E[f(X)| > epsilon /2 }$. - It remains to show that $P(A_n^c)le 1$ (obvious) and that for $n$ large enough, one has $P(A_n^c)le epsilon/(2 K)$ thanks to the convergence in probability established in 1.
edited Nov 17 '17 at 18:21
answered Sep 26 '17 at 4:11
jlewkjlewk
1115
1115
add a comment |
add a comment |
$begingroup$
Chris J.'s answer more or less is correct, but you require almost sure convergence to be able to apply dominated convergence. Fortunately, convergence in probability implies almost sure convergence along a subsequence, and the proof more or less can proceed as desired.
For more details, Kallenberg's Foundations of Modern Probability, First Edition, Lemma 3.7 is useful.
$endgroup$
1
$begingroup$
Or you could apply the bounded convergence theorem.
$endgroup$
– Calculon
Mar 15 '15 at 11:33
3
$begingroup$
Dominated convergence theorem also applies with convergence in probability.
$endgroup$
– perlman
Oct 29 '17 at 0:26
add a comment |
$begingroup$
Chris J.'s answer more or less is correct, but you require almost sure convergence to be able to apply dominated convergence. Fortunately, convergence in probability implies almost sure convergence along a subsequence, and the proof more or less can proceed as desired.
For more details, Kallenberg's Foundations of Modern Probability, First Edition, Lemma 3.7 is useful.
$endgroup$
1
$begingroup$
Or you could apply the bounded convergence theorem.
$endgroup$
– Calculon
Mar 15 '15 at 11:33
3
$begingroup$
Dominated convergence theorem also applies with convergence in probability.
$endgroup$
– perlman
Oct 29 '17 at 0:26
add a comment |
$begingroup$
Chris J.'s answer more or less is correct, but you require almost sure convergence to be able to apply dominated convergence. Fortunately, convergence in probability implies almost sure convergence along a subsequence, and the proof more or less can proceed as desired.
For more details, Kallenberg's Foundations of Modern Probability, First Edition, Lemma 3.7 is useful.
$endgroup$
Chris J.'s answer more or less is correct, but you require almost sure convergence to be able to apply dominated convergence. Fortunately, convergence in probability implies almost sure convergence along a subsequence, and the proof more or less can proceed as desired.
For more details, Kallenberg's Foundations of Modern Probability, First Edition, Lemma 3.7 is useful.
answered Feb 16 '14 at 3:14
Roy D.Roy D.
404211
404211
1
$begingroup$
Or you could apply the bounded convergence theorem.
$endgroup$
– Calculon
Mar 15 '15 at 11:33
3
$begingroup$
Dominated convergence theorem also applies with convergence in probability.
$endgroup$
– perlman
Oct 29 '17 at 0:26
add a comment |
1
$begingroup$
Or you could apply the bounded convergence theorem.
$endgroup$
– Calculon
Mar 15 '15 at 11:33
3
$begingroup$
Dominated convergence theorem also applies with convergence in probability.
$endgroup$
– perlman
Oct 29 '17 at 0:26
1
1
$begingroup$
Or you could apply the bounded convergence theorem.
$endgroup$
– Calculon
Mar 15 '15 at 11:33
$begingroup$
Or you could apply the bounded convergence theorem.
$endgroup$
– Calculon
Mar 15 '15 at 11:33
3
3
$begingroup$
Dominated convergence theorem also applies with convergence in probability.
$endgroup$
– perlman
Oct 29 '17 at 0:26
$begingroup$
Dominated convergence theorem also applies with convergence in probability.
$endgroup$
– perlman
Oct 29 '17 at 0:26
add a comment |
Thanks for contributing an answer to Mathematics Stack Exchange!
- Please be sure to answer the question. Provide details and share your research!
But avoid …
- Asking for help, clarification, or responding to other answers.
- Making statements based on opinion; back them up with references or personal experience.
Use MathJax to format equations. MathJax reference.
To learn more, see our tips on writing great answers.
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f236955%2fconvergence-in-probability-implies-convergence-in-distribution%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
1
$begingroup$
Have you tried the wikipedia article: en.wikipedia.org/wiki/… ? Most books on probability theory include a proof.
$endgroup$
– Gautam Shenoy
Nov 14 '12 at 4:54
$begingroup$
Oh, how come I didn't find it! It looks like something I have in mind. Thank you so much!
$endgroup$
– Hawii
Nov 14 '12 at 5:34