Showing a weighted average is a consistent estimator


























Here's the problem statement: Let $X_1, \dots, X_n$ be independent random variables with common mean $\mu$ and variances $\sigma_i^2$. To estimate $\mu$, we use the weighted average
$$T_n = \sum_{i=1}^n w_i X_i$$
with weights
$$w_i = \frac{\sigma_i^{-2}}{\sum_{j=1}^n \sigma_j^{-2}}.$$
Show that the estimate $T_n$ is consistent (in probability) if $\sum_{j=1}^n \sigma_j^{-2} \to \infty$. End Problem.

So I know that a consistent estimator is one that asymptotically approaches the parameter as the sample size goes to infinity. By the problem statement, if the denominator of each of the weights goes to infinity, then each of the $w_i X_i$ would just go to $0$, yes? I'm a little confused about what this problem is trying to show. Thanks for any help!



























Tags: probability-theory, estimation, law-of-large-numbers

asked Jan 31 at 18:48 by psun

2 Answers







Yes, each $w_iX_i$ goes to $0$, but you are interested in the behaviour of $T_n$. Convergence in probability means
\begin{equation}
\lim_{n \to \infty} P(|T_n - \mu| \ge \epsilon) = 0 \quad \forall \epsilon > 0.
\end{equation}
Try using some concentration inequality involving the variance.
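One natural candidate (my suggestion; the answer leaves the choice open) is Chebyshev's inequality, which needs only the finite variances the problem already assumes:
\begin{equation}
P(|T_n - \mu| \ge \epsilon) \le \frac{\operatorname{Var}(T_n)}{\epsilon^2} \qquad \forall \epsilon > 0,
\end{equation}
so it suffices to show that $\operatorname{Var}(T_n) \to 0$ as $n \to \infty$.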






answered Jan 31 at 19:33 by Popescu Claudiu (edited Feb 1 at 17:44)













• It seems that the Bernstein inequality you want to use requires finite moments of every order, which is neither specified in the question nor needed to reach the conclusion under the assumptions of the problem. – Davide Giraudo, Feb 1 at 14:25












• You are right, I see the variables can be unbounded. I will edit the answer. Thanks! – Popescu Claudiu, Feb 1 at 17:42






















Here are some steps:

1. Without loss of generality, $\mu = 0$.

2. Compute $\operatorname{Var}(T_n)$. Using independence this reduces to $\sum_{i=1}^n \operatorname{Var}(w_i X_i)$. Since $$\operatorname{Var}(w_i X_i) = w_i^2 \sigma_i^2,$$ it follows that
$$
\operatorname{Var}(T_n) = \sum_{i=1}^n \left(\frac{\sigma_i^{-2}}{\sum_{j=1}^n \sigma_j^{-2}}\right)^2 \sigma_i^2 = \sum_{i=1}^n \sigma_i^{-2} \frac{1}{\left(\sum_{j=1}^n \sigma_j^{-2}\right)^2} = \frac{1}{\sum_{j=1}^n \sigma_j^{-2}}.
$$

3. Since $\mathbb{E}[T_n] = \mu$, we get that $\mathbb{E}\left[(T_n - \mu)^2\right] \to 0$, from which the convergence in probability follows.
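To spell out that last implication (an addition here; it is the step the comments below circle around), Chebyshev's inequality turns the vanishing second moment into convergence in probability:
\begin{equation}
P(|T_n - \mu| \ge \epsilon) \le \frac{\mathbb{E}\left[(T_n - \mu)^2\right]}{\epsilon^2} = \frac{1}{\epsilon^2 \sum_{j=1}^n \sigma_j^{-2}} \longrightarrow 0 \qquad \text{for every } \epsilon > 0.
\end{equation}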






answered Feb 1 at 11:30 by Davide Giraudo (edited Feb 1 at 23:12)













• I'm a little confused by how setting $\mu = 0$ helps. When I calculate $\operatorname{Var}(T_n)$, I get the following: $\operatorname{Var}(T_n) = \sum w_i \operatorname{Var}(\sum X_i) = \sum w_i \sum \operatorname{Var}(X_i) = \sum \frac{\sigma_i^{-2}}{\sum \sigma_j^{-2}} \sum \sigma_i^2$, and I'm not sure how this variance shows that the sum $T_n$ converges to the mean. – psun, Feb 1 at 20:53












• I have edited. Actually this computation also works if $\mu \neq 0$. – Davide Giraudo, Feb 1 at 21:25










• So by showing this is the variance of $T_n$, the variance of this estimate goes to $0$. That much I understand. I guess what I don't quite get is how this necessarily shows that $T_n$ converges to the population mean $\mu$. – psun, Feb 1 at 23:05












• @psun I have edited. – Davide Giraudo, Feb 2 at 18:27










• Thanks for everything! This question is clear once you see all the moving parts :) – psun, Feb 4 at 17:37
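As a quick numerical sanity check of the result (not part of the original thread): the setup below is hypothetical, choosing $\sigma_i^2 = i$ so that $\sum_j \sigma_j^{-2}$ is the divergent harmonic series, and normal $X_i$ for convenience; only the means and variances matter for the argument.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical setup: variances sigma_i^2 = i, so sum_j sigma_j^{-2}
    # is the harmonic series and diverges, matching the problem's hypothesis.
    mu = 5.0
    n = 100_000
    sigma2 = np.arange(1, n + 1, dtype=float)    # sigma_i^2 = i
    x = rng.normal(mu, np.sqrt(sigma2))          # independent X_i with mean mu

    inv = 1.0 / sigma2                           # precisions sigma_i^{-2}
    # T_k for every prefix k at once:
    #   T_k = (sum_{i<=k} sigma_i^{-2} X_i) / (sum_{j<=k} sigma_j^{-2})
    t = np.cumsum(inv * x) / np.cumsum(inv)

    for k in (10, 1_000, 100_000):
        print(f"T_{k} = {t[k - 1]:.3f}  (mu = {mu}, Var(T_k) = {1 / inv[:k].sum():.3f})")

Here $\operatorname{Var}(T_n) = 1/\sum_{j \le n} \sigma_j^{-2} = 1/H_n$ shrinks only logarithmically, so the printed estimates drift toward $\mu$ slowly, but they do converge, exactly as the variance computation in the answer predicts.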











