Asymptotic Statistics: Convergence in distribution to a tight CDF implies convergence in probability to a...
I'm taking a course in asymptotic statistics, and I'm having trouble with how to start proving the following statement. I learn best from examples rather than just reading the theorems; however, I can't find an example that seems similar. Any and all help is appreciated!



$$
\left.
\begin{matrix}
k_n \left( Y_n - c \right) \xrightarrow{D} H \\
k_n \to \infty
\end{matrix}
\right\} \implies Y_n \xrightarrow{P} c
$$

There is the additional hint: you may assume without loss of generality that $H$ is the cumulative distribution function of a random variable $V$ which is tight, meaning that, given any $\epsilon > 0$, there exists a finite number $M$ such that $P(|V| \leq M) \geq 1 - \epsilon/2$.





When I look at this, I instinctively think of the Central Limit Theorem with $k_n = \sqrt{n}$, $Y_n = \bar{X}_n$, $c = \mu$, and $H$ the CDF of some normally distributed r.v. $V$. I understand the concept of tightness, which implies that none of the probability mass escapes to infinity, so that makes me think I'm incorrect in assuming $V$ is normal.
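To build intuition, here is a minimal simulation of the CLT special case (my own concrete choice, not part of the problem): with $X_i$ i.i.d. Exponential(1), $Y_n = \bar{X}_n$, $c = 1$ and $k_n = \sqrt{n}$, the statement predicts that the estimated probability $P(|Y_n - 1| > \epsilon)$ shrinks toward $0$ as $n$ grows.

```python
import numpy as np

# Illustration with a concrete choice (my own example, not from the problem):
# X_i i.i.d. Exponential(1), Y_n = sample mean, c = 1, k_n = sqrt(n).
# The CLT gives sqrt(n) * (Y_n - 1) -> N(0, 1) in distribution, and the
# statement then forces Y_n -> 1 in probability.
rng = np.random.default_rng(0)

def prob_far_from_c(n, eps=0.1, reps=2000, c=1.0):
    """Monte Carlo estimate of P(|Y_n - c| > eps) for Y_n = mean of n Exp(1) draws."""
    samples = rng.exponential(scale=1.0, size=(reps, n))
    y_n = samples.mean(axis=1)
    return float(np.mean(np.abs(y_n - c) > eps))

for n in [10, 100, 1000, 5000]:
    print(n, prob_far_from_c(n))  # the estimated probability shrinks toward 0
```

The particular distribution does not matter for the statement; any setup where $k_n(Y_n - c)$ converges in distribution with $k_n \to \infty$ would show the same shrinking behaviour.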
      statistics asymptotics weak-convergence
      asked Jan 24 at 11:57









Stephanie

1 Answer
Let $\epsilon > 0$. Let $M$ be as given in the hint. Without loss of generality we may assume that $H$ is continuous at $\pm M$. If $k_n > \frac{M}{\epsilon}$ then $P\{|Y_n - c| > \epsilon\} = P\{|k_n(Y_n - c)| > k_n \epsilon\} \leq P\{|k_n(Y_n - c)| > M\} \to P\{|V| > M\} < \epsilon/2$.
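Spelled out step by step (a sketch of the same chain, where the final limit is exactly the convergence in distribution applied at the continuity points $\pm M$ of $H$):

$$
\begin{aligned}
P\{|Y_n - c| > \epsilon\}
&= P\{|k_n(Y_n - c)| > k_n \epsilon\} && \text{(multiplying inside by } k_n > 0\text{)}\\
&\leq P\{|k_n(Y_n - c)| > M\} && \text{(valid once } k_n \epsilon > M\text{)}\\
&\to P\{|V| > M\} < \epsilon/2 && \text{(since } k_n(Y_n - c) \xrightarrow{D} V\text{)}.
\end{aligned}
$$

Since $k_n \to \infty$, the condition $k_n > M/\epsilon$ holds for all large $n$, so $\limsup_n P\{|Y_n - c| > \epsilon\} \leq \epsilon/2$; as $\epsilon > 0$ was arbitrary, $Y_n \xrightarrow{P} c$.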
• Thanks for your help, Kavi! The $\to$ implication, does that stem from: as $\epsilon$ is arbitrary and $M$ is fixed and finite, $\epsilon \to 0$ as $k_n \to \infty$? Or is it an implication of the convergence in distribution? Usually that requires some type of $n$ involved in the terms, is that right? Should $\epsilon$ be defined as some type of inverse of $n$ to achieve this?
  – Stephanie, Jan 24 at 13:28
          answered Jan 24 at 12:11









Kavi Rama Murthy