Convergence in probability implies convergence in distribution












A sequence of random variables $\{X_n\}$ converges to $X$ in probability if for any $\varepsilon > 0$,
$$P(|X_n - X| \geq \varepsilon) \to 0.$$

They converge in distribution if
$$F_{X_n}(x) \to F_X(x)$$
at every point $x$ where $F_X$ is continuous.



(There is another equivalent definition of convergence in distribution in terms of weak convergence.)



It seems like a very simple result, but I cannot think of a clever proof.
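For intuition, both definitions can be checked numerically on a toy example. The sketch below (an illustration added here, assuming $X_n = X + Z/\sqrt{n}$ with independent standard normal noise $Z$, which is not part of the question) estimates the exceedance probability from the first definition and a CDF gap from the second:

```python
import random

random.seed(0)
m = 100_000                                       # Monte Carlo sample size
X = [random.gauss(0.0, 1.0) for _ in range(m)]    # target variable X ~ N(0, 1)

def simulate(n, eps=0.1, t=0.5):
    """For X_n = X + Z/sqrt(n), estimate P(|X_n - X| >= eps) and |F_{X_n}(t) - F_X(t)|."""
    Xn = [x + random.gauss(0.0, 1.0) / n ** 0.5 for x in X]
    p_far = sum(abs(xn - x) >= eps for xn, x in zip(Xn, X)) / m
    cdf_gap = abs(sum(xn <= t for xn in Xn) / m - sum(x <= t for x in X) / m)
    return p_far, cdf_gap

p4, g4 = simulate(4)          # small n: the added noise is still large
p400, g400 = simulate(400)    # large n: the added noise is small
print(p4, g4)
print(p400, g400)             # both quantities shrink as n grows
```

Note that the exceedance probability compares $X_n$ and $X$ jointly, while the CDF gap only compares their marginal distributions, which is one way to see why the implication goes in this direction and not the other.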










  • Have you tried the wikipedia article: en.wikipedia.org/wiki/… ? Most books on probability theory include a proof. – Gautam Shenoy, Nov 14 '12 at 4:54












  • Oh, how come I didn't find it! It looks like something I have in mind. Thank you so much! – Hawii, Nov 14 '12 at 5:34
















probability probability-theory convergence random-variables weak-convergence






asked Nov 14 '12 at 4:08 by Hawii
edited Jan 20 at 13:41 by Davide Giraudo








3 Answers






10
A slicker proof (and more importantly one that generalizes) than the one in the wikipedia article is to observe that $X_n \Longrightarrow X$ if and only if for all bounded continuous functions $f$ we have $E f(X_n) \to E f(X)$. If you have convergence in probability, then you can apply the dominated convergence theorem (recalling that $f$ is bounded, and that for continuous $f$, $X_n \to X$ in probability implies $f(X_n) \to f(X)$ in probability) to conclude that $E|f(X_n) - f(X)| \to 0$, which implies the result.

– Chris Janjigian, answered Nov 14 '12 at 20:55
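One way to spell out the dominated-convergence step in this argument (writing $\|f\|_\infty = \sup_x |f(x)|$):

$$|E f(X_n) - E f(X)| \le E\,|f(X_n) - f(X)| \to 0,$$

since $|f(X_n) - f(X)| \le 2\|f\|_\infty$ supplies a constant dominating bound, and $f(X_n) \to f(X)$ in probability, so the version of dominated convergence that works under convergence in probability applies.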






    1













    Here is an answer that does not rely on dominated convergence.



    To prove convergence in distribution, we only need to establish that $E[f(X_n)]$ converges to $E[f(X)]$ for bounded continuous functions $f$.
    By the definition of the limit, we need to prove that for any $\epsilon > 0$, there is some $n_0 = n_0(\epsilon)$ such that for all $n > n_0$ the inequality $|E[f(X_n)] - E[f(X)]| < \epsilon$ holds.




    1. As suggested in another answer, the first step is to show that if $X_n$ converges to $X$ in probability, then $f(X_n)$ also converges in probability to $f(X)$ for any continuous $f$.

    2. Let $f$ be any continuous function bounded by $K$. Take any $\epsilon > 0$ and show that
      $$|E[f(X_n)] - E[f(X)]| \le E[\,|f(X_n) - f(X)|\,] \le (\epsilon/2)\,P(A_n^c) + 2K\,P(A_n)$$
      where $A_n$ is the event $\{\,|f(X_n) - f(X)| > \epsilon/2\,\}$.

    3. It remains to note that $P(A_n^c) \le 1$ (obvious) and that for $n$ large enough, $P(A_n) \le \epsilon/(4K)$, thanks to the convergence in probability established in step 1.

    – jlewk, answered Sep 26 '17 at 4:11, edited Nov 17 '17 at 18:21
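The chain of inequalities in step 2 can be sanity-checked numerically. The sketch below (an added illustration, with $f = \tanh$ so $K = 1$, and the toy sequence $X_n = X + Z/\sqrt{n}$, both assumptions not taken from the answer) estimates the leftmost and middle terms by sample averages:

```python
import math
import random

random.seed(1)
m = 50_000                                        # Monte Carlo sample size
f = math.tanh                                     # bounded continuous, |f| <= K = 1
X = [random.gauss(0.0, 1.0) for _ in range(m)]
Xn = [x + random.gauss(0.0, 1.0) / 10 for x in X]  # X_n = X + Z/sqrt(n) with n = 100

# Sample-average estimates of |E[f(X_n)] - E[f(X)]| and E[|f(X_n) - f(X)|].
lhs = abs(sum(f(xn) for xn in Xn) / m - sum(f(x) for x in X) / m)
mid = sum(abs(f(xn) - f(x)) for xn, x in zip(Xn, X)) / m

print(lhs, mid)   # lhs <= mid holds exactly by the triangle inequality
```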






      0













      Chris J.'s answer is more or less correct, but you need almost sure convergence to apply the standard dominated convergence theorem. Fortunately, convergence in probability implies almost sure convergence along a subsequence, and the proof can then proceed essentially as desired.

      For more details, Lemma 3.7 of Kallenberg's Foundations of Modern Probability (first edition) is useful.

      – Roy D., answered Feb 16 '14 at 3:14






      • Or you could apply the bounded convergence theorem. – Calculon, Mar 15 '15 at 11:33






      • Dominated convergence theorem also applies with convergence in probability. – perlman, Oct 29 '17 at 0:26










