Is the estimator $\dfrac{\bar X}{1+\bar X}$ of $\theta$ consistent?















Let $X_1,X_2,X_3,\ldots,X_n$ be a random sample from a population $X$
having the probability density function
$$ f(x;\theta) = \begin{cases} \theta x^{\theta -1} & \text{if $0 \le x \le 1$} \\ 0 & \text{otherwise} \end{cases}$$




Is the estimator $\hat\theta = \dfrac{\bar X}{1-\bar X}$ a consistent estimator of $\theta$?



I am trying to find $E(X)$ here to see whether it is asymptotically equal to $\theta$, but I am not sure how to find the expectation here; I am not familiar with finding the expectation of a ratio like this.
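(Editorial addition, not part of the original post.) For readers who want to experiment numerically, here is a minimal Python sketch; the helper names are mine. It only uses the fact that the CDF of $f(x;\theta)$ is $F(x)=x^{\theta}$ on $[0,1]$, so inverse-transform sampling gives $X=U^{1/\theta}$ with $U\sim\mathrm{Uniform}(0,1)$. Both the estimator written in the title (with $1+\bar X$) and the one in the question body (with $1-\bar X$) are computed.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def sample_x(n, theta):
    # f(x; theta) = theta * x**(theta - 1) on (0, 1) has CDF F(x) = x**theta,
    # so inverse-transform sampling gives X = U**(1/theta), U ~ Uniform(0, 1).
    u = rng.uniform(size=n)
    return u ** (1.0 / theta)

def theta_hat_minus(x):
    # estimator from the (edited) question body: Xbar / (1 - Xbar)
    xbar = x.mean()
    return xbar / (1.0 - xbar)

def theta_hat_plus(x):
    # estimator as written in the title: Xbar / (1 + Xbar)
    xbar = x.mean()
    return xbar / (1.0 + xbar)

theta = 2.0
for n in (100, 10_000, 1_000_000):
    x = sample_x(n, theta)
    print(n, theta_hat_minus(x), theta_hat_plus(x))
```

With $\theta=2$, the $1-\bar X$ version should drift toward $2$ as $n$ grows, while the $1+\bar X$ version should settle near $\theta/(1+2\theta)=0.4$, which previews the answers below.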










Tags: statistical-inference, parameter-estimation






asked Jan 11 at 5:22 by Daman deep; edited Jan 11 at 7:47 by Ahmad Bazzi






















          2 Answers




















> I am trying to find $E(X)$ here to see if it is equal to $\theta$ asymptotically

It's not $E(X)$ that should be equal to $\theta$ asymptotically, it's $\hat{\theta}$.

Let's find $E(X)$ as you suggest:
$$E(X) = \int_0^1 x f(x;\theta)\, dx = \int_0^1 \theta x^{\theta}\, dx = \frac{\theta}{\theta+1}.$$

Now let's see how $\hat{\theta}$ behaves in the asymptotic regime:
$$\hat{\theta} = \dfrac{\bar X}{1-\bar X} \rightarrow \frac{E(X)}{1-E(X)} = \frac{\frac{\theta}{\theta+1}}{1-\frac{\theta}{\theta+1}} = \frac{\theta}{\theta+1}\cdot\frac{\theta+1}{1} = \theta.$$
So, what can you say about $\hat{\theta}$?






answered Jan 11 at 7:23, edited Jan 11 at 7:47 – Ahmad Bazzi











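(Editorial note, not part of the answer above.) A quick simulation sketch of the limit just derived, again assuming the inverse-transform sampler $X=U^{1/\theta}$: as $n$ grows, $\bar X$ should approach $\theta/(\theta+1)$ and $\bar X/(1-\bar X)$ should approach $\theta$.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
theta = 3.0   # true parameter; E(X) = theta / (theta + 1) = 0.75

for n in (10, 100, 1_000, 10_000, 100_000, 1_000_000):
    x = rng.uniform(size=n) ** (1.0 / theta)   # X = U**(1/theta) ~ f(x; theta)
    xbar = x.mean()
    print(f"n={n:>9d}  xbar={xbar:.4f}  "
          f"xbar/(1-xbar)={xbar / (1 - xbar):.4f}  (theta={theta})")
```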









• Hey @Damandeep, yes it is not asymptotically consistent .. what does your textbook say? – Ahmad Bazzi, Jan 11 at 7:38

• Yeah because they got another estimator which is $\hat{\theta} = \frac{\bar{X}}{1- \bar{X}}$ .. in your question you have a plus sign instead of a minus in the denominator .. makes a WHOLEEE lot of difference. – Ahmad Bazzi, Jan 11 at 7:45

• no worries, i have edited your question and my answer .. please see now :) @Damandeep – Ahmad Bazzi, Jan 11 at 7:47

• no need @Damandeep .. question is still the same ;) .. btw if you found the answer useful you can mark it as correct as well – Ahmad Bazzi, Jan 11 at 7:51

• An estimator is consistent if, as the sample size increases, the estimates "converge" to the true value of the parameter being estimated. That's it. What you mention ($V(\hat{\theta}) = 0$) is a consequence of certain types of consistent estimators. – Ahmad Bazzi, Jan 11 at 8:44 [a short simulation sketch after these comments illustrates this]
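(Editorial sketch prompted by the last comment; not part of the discussion above.) The comment distinguishes consistency, i.e. the estimates concentrating at the true $\theta$ as $n$ grows, from the variance of $\hat\theta$ shrinking. A small Monte Carlo illustration, using the same hypothetical sampler as before:

```python
import numpy as np

rng = np.random.default_rng(seed=4)
theta, reps = 2.0, 2_000   # true parameter and number of replications per n

for n in (10, 100, 1_000, 10_000):
    est = np.empty(reps)
    for r in range(reps):
        xbar = (rng.uniform(size=n) ** (1.0 / theta)).mean()
        est[r] = xbar / (1.0 - xbar)          # theta-hat = Xbar / (1 - Xbar)
    # center and spread of theta-hat across replications
    print(f"n={n:>6d}  mean={est.mean():.4f}  sd={est.std():.4f}")
```

The mean of $\hat\theta$ across replications should move toward $\theta$ and the standard deviation toward $0$ as $n$ grows, illustrating both aspects mentioned in the comment.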




















          I suppose the answer is no. Perhaps we may do as follows.



Denote
\begin{align}
S_n&=\frac{1}{n}\sum_{j=1}^nX_j,\\
T_n&=\frac{S_n}{1+S_n}.
\end{align}

Let us use the strong law of large numbers (SLLN) and the dominated convergence theorem (DCT) to investigate consistency.



For one thing, each $X_j\in\left[0,1\right]$ implies that $S_n\in\left[0,1\right]$. Consequently, $T_n\in\left[0,1/2\right]$, which implies that $\left|T_n\right|\le 1/2$ for all $n\in\mathbb{N}$. This upper bound $1/2$ is obviously integrable under the probability measure, i.e., $\mathbb{E}(1/2)=1/2<\infty$.



For another, by the SLLN, almost surely
$$
\lim_{n\to\infty}S_n=\mathbb{E}X_1=\frac{\theta}{1+\theta},
$$

which thus leads to
$$
\lim_{n\to\infty}T_n=\frac{\lim_{n\to\infty}S_n}{1+\lim_{n\to\infty}S_n}=\frac{\frac{\theta}{1+\theta}}{1+\frac{\theta}{1+\theta}}=\frac{\theta}{1+2\theta}.
$$



Combining the above two facts, the DCT applies. This gives
$$
\lim_{n\to\infty}\mathbb{E}T_n=\mathbb{E}\left(\lim_{n\to\infty}T_n\right)=\mathbb{E}\frac{\theta}{1+2\theta}=\frac{\theta}{1+2\theta}\ne\theta.
$$

Moreover, since the almost-sure limit $\theta/(1+2\theta)$ differs from $\theta$, $T_n$ cannot converge to $\theta$ in probability either. Therefore, $T_n$ is not a consistent estimator of $\theta$.
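(Editorial check, not part of this answer.) A short simulation agrees with this limit: with $\theta=2$, $T_n$ settles near $\theta/(1+2\theta)=0.4$ rather than near $2$. The sampler $X=U^{1/\theta}$ is an assumption of the sketch, justified by the CDF $F(x)=x^\theta$.

```python
import numpy as np

rng = np.random.default_rng(seed=2)
theta = 2.0

for n in (100, 10_000, 1_000_000):
    s_n = (rng.uniform(size=n) ** (1.0 / theta)).mean()   # S_n, using X = U**(1/theta)
    t_n = s_n / (1.0 + s_n)                                # T_n = S_n / (1 + S_n)
    print(f"n={n:>8d}  T_n={t_n:.4f}  target theta/(1+2*theta)={theta/(1+2*theta):.4f}")
```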



However, I suppose there might be a typo in $T_n$. In fact, if we consider instead the estimator
$$
U_n=\frac{S_n}{1-S_n},
$$

the answer would be yes: by the SLLN and continuity, $U_n\to\dfrac{\theta/(1+\theta)}{1-\theta/(1+\theta)}=\theta$ almost surely.



The proof is essentially the same as above, although some tricks are necessary. This is because the DCT would fail here: $U_n$ is no longer bounded from above (more precisely, it is not trivial to find a dominating $V$ such that $\mathbb{E}V<\infty$ and $\left|U_n\right|\le V$).



To help facilitate the proof, define
$$
U_{mn}=U_n\cdot 1_{\left\{U_n\le m\right\}}.
$$

Note that $U_n\ge 0$ is guaranteed because $S_n\in\left[0,1\right]$, so each $U_{mn}$ is nonnegative. Besides, note that $U_{mn}$ is nondecreasing in $m$, i.e.,
$$
U_{mn}\le U_{m+1,n}.
$$

Thanks to these two facts, the monotone convergence theorem (MCT) applies. It gives
$$
\mathbb{E}\left(\lim_{m\to\infty}U_{mn}\right)=\lim_{m\to\infty}\mathbb{E}U_{mn}.
$$

Consequently,
$$
\lim_{n\to\infty}\mathbb{E}U_n=\lim_{n\to\infty}\mathbb{E}\left(\lim_{m\to\infty}U_{mn}\right)=\lim_{n\to\infty}\lim_{m\to\infty}\mathbb{E}U_{mn}.
$$

Further, for each fixed $m$, $\mathbb{E}U_{mn}$ converges as $n\to\infty$ by the DCT, and for each fixed $n$, $\mathbb{E}U_{mn}$ converges uniformly as $m\to\infty$ due to the uniform cutoff. These facts imply that
$$
\lim_{n\to\infty}\lim_{m\to\infty}\mathbb{E}U_{mn}=\lim_{m\to\infty}\lim_{n\to\infty}\mathbb{E}U_{mn}.
$$



Thanks to all these arguments, we may safely conclude that
$$
\lim_{n\to\infty}\mathbb{E}U_n=\lim_{m\to\infty}\lim_{n\to\infty}\mathbb{E}U_{mn}=\lim_{m\to\infty}\mathbb{E}\left(\lim_{n\to\infty}U_{mn}\right)=\cdots=\theta.
$$
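(Editorial check, not part of this answer.) A rough Monte Carlo estimate of $\mathbb{E}U_n$, averaging $U_n$ over independent replications, should drift toward $\theta$ as $n$ grows, in line with the limit $\lim_{n\to\infty}\mathbb{E}U_n=\theta$ derived above. This is only a sketch; the replication count and sample sizes are arbitrary, and the sampler $X=U^{1/\theta}$ is assumed.

```python
import numpy as np

rng = np.random.default_rng(seed=3)
theta, reps = 2.0, 2_000   # true parameter and number of replications per n

for n in (10, 100, 1_000, 10_000):
    u_vals = np.empty(reps)
    for r in range(reps):
        s_n = (rng.uniform(size=n) ** (1.0 / theta)).mean()   # S_n
        u_vals[r] = s_n / (1.0 - s_n)                          # U_n = S_n / (1 - S_n)
    print(f"n={n:>6d}  Monte Carlo E[U_n] ~ {u_vals.mean():.4f}  (theta = {theta})")
```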






answered Jan 11 at 7:36 – hypernova






















• Thanks for your effort – Daman deep, Jan 11 at 7:49










• @Damandeep: No problem :-) – hypernova, Jan 11 at 8:06










