Variance of a ratio of mean values of functions of a random variable


























The problem:



I have a random process in which a real-valued random variable takes independent values $x_1, x_2, \dots, x_n$. I then define $Q$ as



$$ Q = \frac{n^{-1}\sum_{i=1}^n f(x_i)}{n^{-1}\sum_{i=1}^n g(x_i)}$$



where $f$ and $g$ are known functions of the $x_i$. Note that $x_1, x_2,$ etc. take the same values in both the numerator and the denominator.



My goal is to obtain an expression for $\operatorname{var}\left[Q\right]$.





My efforts:



I tried the equation



$$\operatorname{var}\left[\frac{X}{Y}\right]\approx\frac{\operatorname{var}\left[X\right]}{\operatorname{E}\left[Y\right]^2}-\frac{2\operatorname{E}\left[X\right]}{\operatorname{E}\left[Y\right]^3}\operatorname{cov}\left[X,Y\right]+\frac{\operatorname{E}\left[X\right]^2}{\operatorname{E}\left[Y\right]^4}\operatorname{var}\left[Y\right]$$



from this article, taking $X$ and $Y$ to be the numerator and denominator, respectively.



I considered $\operatorname{E}[X]= \operatorname{E}[f(x_i)]$ and $\operatorname{var}\left[X\right]=\operatorname{var}\left[f(x_i)\right]/n$.
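For concreteness, here is a small Monte Carlo sketch of what I have in mind. The particular choices $f(x)=x^2$, $g(x)=e^x$, Gaussian $x_i$, and the sample sizes are arbitrary and only for illustration; they are not part of the problem.

    import numpy as np

    rng = np.random.default_rng(0)

    def f(x): return x**2        # illustrative choice, not from the problem
    def g(x): return np.exp(x)   # illustrative choice, not from the problem

    n, n_rep = 200, 20000

    # Empirical variance of Q over many independent replications of the process
    x = rng.normal(1.0, 0.3, size=(n_rep, n))
    Q = f(x).mean(axis=1) / g(x).mean(axis=1)
    print("empirical var[Q]:", Q.var())

    # Delta-method approximation, with X, Y the sample means of f and g, so that
    # var[X] = var[f(x_1)]/n, var[Y] = var[g(x_1)]/n, cov[X,Y] = cov(f, g)/n.
    xs = rng.normal(1.0, 0.3, size=10**6)
    EX, EY = f(xs).mean(), g(xs).mean()
    varX, varY = f(xs).var() / n, g(xs).var() / n
    covXY = np.cov(f(xs), g(xs))[0, 1] / n
    print("approximation:   ", varX/EY**2 - 2*EX/EY**3*covXY + EX**2/EY**4*varY)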





Specific doubts:




  • Is this the right procedure?


  • How do I evaluate $\operatorname{cov}(X,Y)$? (I thought that $\operatorname{cov}(X,Y)=\operatorname{E}\left[XY\right]-\operatorname{E}\left[X\right]\operatorname{E}\left[Y\right]$, but I am unsure because of the next point, and because the variance should be divided by $n$.)


  • How does the fact that $x_1, x_2,\dots$ are the same in the numerator and the denominator influence this analysis?



Thank you very much.

































  • Are you looking for an exact form for the variance or an approximation? – Math1000, Dec 21 '18 at 21:27










  • @Math1000 Thank you for the reply. An approximation may be enough. My final interest is to know approximately how $\operatorname{var}[Q]$ is related to the variance of the $x_i$'s and the other parameters included in $f$ and $g$. – user1420303, Dec 21 '18 at 21:38
















Tags: covariance, variance, expected-value






asked Dec 21 '18 at 21:07 by user1420303, edited Dec 21 '18 at 23:39

1 Answer
I actually have the exact same problem and I had been somewhat confused about the difference between (what I perceived to be) two separate situations:




  1. The expectation value and variance of the ratio of two random variables:


$$\text{E}\left[\frac{A}{B}\right] \quad\text{and}\quad \text{Var}\left[\frac{A}{B}\right]$$




  2. The variance of an estimator constructed by taking the ratio of the sample means of two random variables:


$$Q = \frac{A}{B} = \frac{\frac{1}{N}\sum_{i=1}^N a(x_i)}{\frac{1}{N}\sum_{i=1}^N b(x_i)} \quad\longrightarrow\quad \text{Var}\left[Q\right]$$





In the first case, there is a known formula, as you have already pointed out. Yet, I can relate to your lingering doubts as to whether the given formula applies to the second case as it feels somehow different. However, after some thinking, I believe the formula also applies to the second scenario. My reasoning goes as follows:



In the case where $Q = A/B$, with $A$ and $B$ the sample means of $a(x)$ and $b(x)$, $A$ and $B$ are themselves random variables, each with an associated error. Thus, even though we're taking the ratio of the (sample) means of two variables, those sample means are themselves random variables, so $\text{Var}[Q]$ really can be computed using the formula for $\text{Var}[A/B]$ as in the first case.
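To spell out what enters the formula in this second scenario, here is a sketch of the standard computation, under the assumption that the $x_i$ are i.i.d.:

$$\text{E}[A]=\text{E}[a(x_1)],\qquad \text{Var}[A]=\frac{\text{Var}[a(x_1)]}{N},\qquad \text{Cov}[A,B]=\frac{1}{N^2}\sum_{i,j}\text{Cov}\big(a(x_i),b(x_j)\big)=\frac{\text{Cov}\big(a(x_1),b(x_1)\big)}{N},$$

since all cross terms with $i \neq j$ vanish by independence (and analogously for $\text{E}[B]$ and $\text{Var}[B]$). The fact that the numerator and denominator share the same $x_i$ enters only through this single covariance term.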



This reasoning becomes even clearer when considering error propagation. Since $A$ and $B$ are estimates of some fluctuating quantities, they each have an associated error. To find the variance of a function $Q$ defined as their ratio (as in the case we're considering), we can use error propagation to see how the errors in estimating $A$ and $B$ propagate to the error in $Q$. Indeed, the same formula for the ratio of two random variables, $\text{Var}[f] = \text{Var}[A/B]$, appears in the table of formulas on the Wikipedia page on propagation of uncertainty.
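As a purely illustrative sketch (my own code, not taken from that page; the function name and the example choices $a(x)=x^2$, $b(x)=e^x$ are assumptions), the plug-in estimate from a single sample could be computed like this:

    import numpy as np

    def var_ratio_of_means(a_vals, b_vals):
        # Delta-method / error-propagation estimate of Var[mean(a)/mean(b)]
        # from one sample, where a_vals[i] = a(x_i) and b_vals[i] = b(x_i)
        # are evaluated on the *same* draws x_i.
        a_vals = np.asarray(a_vals, dtype=float)
        b_vals = np.asarray(b_vals, dtype=float)
        N = a_vals.size
        A, B = a_vals.mean(), b_vals.mean()
        # Sample (co)variance matrix of (a(x_i), b(x_i)); dividing by N gives
        # the (co)variances of the sample means A and B.
        c = np.cov(a_vals, b_vals, ddof=1) / N
        varA, varB, covAB = c[0, 0], c[1, 1], c[0, 1]
        return varA / B**2 - 2 * A / B**3 * covAB + A**2 / B**4 * varB

    # Hypothetical usage with the illustrative choices above:
    x = np.random.default_rng(1).normal(size=500)
    print(var_ratio_of_means(x**2, np.exp(x)))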



Though I've been using this formula for my own work, hopefully someone else can weigh in about the soundness of this reasoning and any caveats.





Note: Depending on the degree of nonlinearity in $a(x)$ and $b(x)$, the error estimate (for the ratio of their sample averages) will be biased. There are probably other caveats that I'm not aware of, but I hope this boosts your confidence in your use of the formula.






answered Jan 23 at 18:57 by seylermoon





























