Covariance of polynomials of random normal variables












Score: 2 · Asked Aug 7 '18 at 22:13 by Mathieu · Edited Aug 8 '18 at 6:03
$\newcommand{\Cov}{\operatorname{Cov}}$If $X$ and $Y$ are random variables with a bivariate normal distribution and:

  • $X\sim\mathcal{N}(\mu_X,\sigma_X^2)$
  • $Y\sim\mathcal{N}(\mu_Y,\sigma_Y^2)$
  • $\Cov(X,Y)\neq 0$,

can I compute $\Cov(X^m,Y^n)$ for arbitrary positive integers $m$ and $n$?










Tags: probability, statistics, random-variables, normal-distribution, covariance








  • I don't think you can compute this object for arbitrary $m$ and $n$. You would need to know more information about the random variables $X, Y$ in the form of their joint distribution. Even if you assume that the covariance is well known, you still can't find the higher moments of the distribution. – DinosaurEgg, Aug 7 '18 at 22:32










  • If I assume instead that $(X,Y)$ have a bivariate normal distribution whose variance-covariance matrix is known, does the question become answerable? – Mathieu, Aug 7 '18 at 22:52










  • This question fails to state that $X,Y$ are JOINTLY normal. If that is assumed, then, since their covariance is $0,$ they are independent. – Michael Hardy, Aug 8 '18 at 0:32










  • It is sloppy notation to write $X\sim N(\mu_x, \sigma^2_x)$ instead of $X\sim N(\mu_X, \sigma^2_X).$ And in many contexts, when working with only slightly more involved problems of this kind, this sort of confusion can paralyze you. – Michael Hardy, Aug 8 '18 at 0:37










  • In your comment you say $X,Y$ have a bivariate normal distribution, but in your question you say only that each one separately is normally distributed and their covariance is $0.$ I can show you examples of a distribution of a pair $(X,Y)$ in which each is normally distributed and their covariance is $0$ and they are NOT JOINTLY normally distributed. But if we assume bivariate (thus joint) normality, then the answer is easy. See my answer below. – Michael Hardy, Aug 8 '18 at 0:44
















3 Answers
Answer (score 2) – answered Aug 8 '18 at 23:01 by DinosaurEgg; edited Jan 29 at 11:39 by Martin Sleziak

$\newcommand{\Cov}{\operatorname{Cov}}$Given the new background that the OP provided to his question (most general bivariate normal distribution for variables $(X_1, X_2)$ with non-zero covariance), it is possible to find a general formula for $\Cov(X_1^m, X_2^n)$ as follows:

Consider the generating function $\phi(t_1,t_2)$ for the bivariate distribution given here, in equation (57) (a clean derivation is given for it, so I won't repeat it). Then:

$$E(X_1^m X_2^n)=(-i)^{m+n}\left(\frac{\partial}{\partial t_1}\right)^m\left(\frac{\partial}{\partial t_2}\right)^n\phi(t_1,t_2)\Big|_{(t_1, t_2)=(0,0)} \tag 1$$

After some tedious and careful algebra we wish to write the generating function $\phi(t_1,t_2)=e^{it_1\mu_1+it_2\mu_2}e^{-\frac{1}{2}(\sigma_1^2t_1^2+2\rho\sigma_1t_1\sigma_2t_2+\sigma_2^2t_2^2)}$ in the form:

$$\phi(t_1,t_2)=e^{-a}e^{-b(t_2+c)^2}e^{-d(t_1+gt_2+h)^2}\tag 2$$

which is possible for the values:

$$a=\frac{\mu_1^2}{2\sigma_1^2}+\frac{\left(\mu_2-\mu_1\rho \frac{\sigma_2}{\sigma_1}\right)^2}{2\sigma^2_2(1-\rho^2)}, \hspace{0.2cm}b=\frac{1}{2}(1-\rho^2)\sigma_{2}^2, \hspace{0.2cm}c=-i\frac{\mu_2-\mu_1\rho \frac{\sigma_2}{\sigma_1}}{2\sigma^2_2(1-\rho^2)}, \hspace{0.2cm}d=\frac{\sigma_1^2}{2},\hspace{0.2cm} g=\frac{\rho\sigma_2}{\sigma_1},\hspace{0.2cm}h=-i\frac{\mu_1}{\sigma_1^2}$$

We change variables to $y_2=\sqrt{b}(t_2+c),\ y_1=\sqrt{d}(t_1+gt_2+h)$ and we find:

$$\begin{align}\frac{\partial}{\partial t_1}&=\sqrt{d}\frac{\partial}{\partial y_1}\\ \frac{\partial}{\partial t_2}&=g\sqrt{d}\frac{\partial}{\partial y_1}+\sqrt{b}\frac{\partial}{\partial y_2}\end{align}$$

Substitute into (1) for the result:

$$E(X_1^mX_2^n)=(-i)^{m+n}\left(\sqrt{d}\frac{\partial}{\partial y_1}\right)^m\left(g\sqrt{d}\frac{\partial}{\partial y_1}+\sqrt{b}\frac{\partial}{\partial y_2}\right)^n e^{-a}e^{-y_1^2}e^{-y^2_2}\Big|_{(y_1, y_2)=(c\sqrt{b},h\sqrt{d})}$$

Expanding the parentheses and using the Rodrigues formula for Hermite polynomials, we get:

$$E(X_1^mX_2^n)=e^{-a-bc^2-dh^2}\,i^{m+n}g^n d^{\frac{m+n}{2}}\sum_{k=0}^n\frac{n!}{k!(n-k)!}\Big(\frac{\sqrt{b}}{g\sqrt{d}}\Big)^{n-k}H_{n-k}(h\sqrt{d})H_{m+k}(c\sqrt{b})$$

The expectation values $E(X_1^m)$ and $E(X_2^n)$ can be calculated by setting $n=0$ and $m=0$ in the general formula, respectively.
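As a numerical sanity check on equation (1) (my addition, not part of the original answer), here is a minimal sketch that differentiates the characteristic function symbolically and compares the resulting moment against a Monte Carlo estimate. It assumes `sympy` and `numpy` are available; the exponents and parameter values (`m`, `n`, `mu1`, `s1`, `rho`, ...) are arbitrary illustrative choices of mine.

```python
import numpy as np
import sympy as sp

# Illustrative parameters (arbitrary choices, not from the answer)
m, n = 3, 2
mu1, mu2, s1, s2, rho = 0.5, -1.0, 1.0, 2.0, 0.7

# Characteristic function phi(t1, t2) of the bivariate normal
t1, t2 = sp.symbols('t1 t2')
phi = sp.exp(sp.I*(mu1*t1 + mu2*t2)
             - sp.Rational(1, 2)*(s1**2*t1**2 + 2*rho*s1*s2*t1*t2 + s2**2*t2**2))

# Equation (1): E[X1^m X2^n] = (-i)^(m+n) d^m/dt1^m d^n/dt2^n phi |_{t=0}
moment = (-sp.I)**(m + n) * sp.diff(phi, t1, m, t2, n)
moment = complex(moment.subs({t1: 0, t2: 0})).real
print("via characteristic function:", moment)

# Monte Carlo cross-check
rng = np.random.default_rng(0)
cov = [[s1**2, rho*s1*s2], [rho*s1*s2, s2**2]]
xy = rng.multivariate_normal([mu1, mu2], cov, size=1_000_000)
print("Monte Carlo estimate:       ", (xy[:, 0]**m * xy[:, 1]**n).mean())
```

The two numbers should agree to within Monte Carlo error; $\Cov(X_1^m,X_2^n)=E(X_1^mX_2^n)-E(X_1^m)E(X_2^n)$ then follows by running the same computation for the marginal moments (the $n=0$ and $m=0$ cases).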






  • This makes the problem far more complicated than it is, unless you don't know that JOINT normality plus uncorrelatedness entails independence. – Michael Hardy, Aug 9 '18 at 2:08










  • The OP has stated that the two variables are correlated, since their covariance is non-zero. The OP also updated their request with a joint probability that reflects this. This problem is valid, and I answered with a calculation that addresses it. The formula is simple and a finite polynomial in all variables except for $\rho$, so I don't see how this problem is complicated. – DinosaurEgg, Aug 9 '18 at 3:43










  • I don't think the OP gave a good description of what the problem was in its initial wording, but then they changed their mind. It seems to me that to them quadratic correlations are more important than solving the totally trivial independent-variable problem. – DinosaurEgg, Aug 9 '18 at 3:48



















Answer (score -1) – answered Aug 7 '18 at 22:37 by Malkin; edited Jan 29 at 11:39 by Martin Sleziak
I do not think that there is a nice answer here, sorry.$\newcommand{\Cov}{\operatorname{Cov}}$

Note firstly that
$$
\Cov(X^m,Y^n)=E(X^mY^n)-E(X^m)E(Y^n)
$$
by definition of the covariance.

This answer provides you with:

$$E(X^m)=\sum_{k=0}^{\lfloor m/2\rfloor} {m \choose 2k}(2k-1)!!\,\sigma_X^{2k}\mu_X^{m-2k},$$

and similarly for $Y^n$. However, evaluating $E(X^mY^n)$ is the issue. See here how messy the distribution of the product of just two non-independent normal random variables is, yet you have the product of $m+n$ of them!
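For what it's worth, the moment formula above is easy to check numerically. The following sketch is my addition, assuming `numpy` is available; the function name `normal_moment` and the example parameters are hypothetical choices for illustration.

```python
import math
import numpy as np

def normal_moment(m: int, mu: float, sigma: float) -> float:
    """E[X^m] for X ~ N(mu, sigma^2), via the double-factorial sum above."""
    total = 0.0
    for k in range(m // 2 + 1):
        double_fact = math.prod(range(2*k - 1, 0, -2))  # (2k-1)!!; empty product is 1
        total += math.comb(m, 2*k) * double_fact * sigma**(2*k) * mu**(m - 2*k)
    return total

# Arbitrary example: E[X^4] for X ~ N(0.7, 1.3^2)
rng = np.random.default_rng(1)
x = rng.normal(0.7, 1.3, size=2_000_000)
print(normal_moment(4, 0.7, 1.3))  # value from the formula
print(np.mean(x**4))               # Monte Carlo estimate
```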






Answer (score -2) – answered Aug 8 '18 at 0:41 by Michael Hardy; edited Jan 29 at 11:39 by Martin Sleziak

You have said$\newcommand{\Cov}{\operatorname{Cov}}$
\begin{align}
& X\sim N(\mu_X,\sigma^2_X), \\[4pt]
& Y \sim N(\mu_Y,\sigma^2_Y), \\[4pt]
& \Cov(X,Y) = 0.
\end{align}
That falls short of specifying the joint distribution of $(X,Y).$ If it were further specified that $X,Y$ are jointly normally distributed, then the covariance can be $0$ only if $X,Y$ are independent. If $X,Y$ are independent, then so are $X^m,Y^n,$ so their covariance is also $0.$

Here is a simple example: Suppose $X\sim N(\mu_X, \sigma_X^2),$ let $Z = \text{the “z-score”} = (X-\mu_X)/\sigma_X,$ and independently of $X$ you toss a coin. Then let $Y = \mu_Y \pm \sigma_Y Z,$ where the choice between “$\pm$” is determined by the coin toss. Then $X,Y$ have covariance $0$ and have just the distributions you specified in the question, but they are NOT JOINTLY normally distributed and not independent.

But if you assume joint normality, which in the case of two random variables means bivariate normality, then the answer is just as in the first paragraph above.
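To make the coin-toss counterexample concrete, here is a small simulation (my addition, not part of the original answer; it assumes `numpy`, and the parameter values are arbitrary). The sample covariance of $(X,Y)$ comes out near zero, while the covariance of $(X^2,Y^2)$ clearly does not, showing the pair is uncorrelated but dependent.

```python
import numpy as np

# Coin-toss construction from the answer: X normal, Z its z-score,
# Y = mu_Y + sign * sigma_Y * Z, with the sign from an independent fair coin.
rng = np.random.default_rng(2)
mu_X, sigma_X, mu_Y, sigma_Y = 1.0, 2.0, -0.5, 1.5   # arbitrary choices
N = 1_000_000

X = rng.normal(mu_X, sigma_X, size=N)
Z = (X - mu_X) / sigma_X                 # z-score of X
sign = rng.choice([-1.0, 1.0], size=N)   # independent coin toss
Y = mu_Y + sign * sigma_Y * Z            # marginally N(mu_Y, sigma_Y^2)

print(np.cov(X, Y)[0, 1])         # ~ 0: X and Y are uncorrelated
print(np.cov(X**2, Y**2)[0, 1])   # clearly nonzero: X and Y are dependent
```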






  • Thanks, but my original question states that $\Cov(X,Y)$ is not equal to zero. I edited it to specify a joint normal distribution. – Mathieu, Aug 8 '18 at 6:05










  • @Mathieu : In that case, my first paragraph above answers your question, and the answer by E.Malkin misses the point at best. There is no need to know the variance of $X^m$ or even the expected value of $X^m$ in order to answer your question. – Michael Hardy, Aug 8 '18 at 12:41










  • Sorry, but I must be missing something: my question states that $\Cov(X,Y)$ is not zero, and your first paragraph above states the opposite. Not sure what to make of that. – Mathieu, Aug 8 '18 at 13:14












  • @MichaelHardy I also don't see how your first paragraph answers the question, and could you please explain how I've missed the point ("at best")? Knowing $E(X^m)$ means that finding $\Cov(X^m,Y^n)$ is equivalent to finding $E(X^mY^n)$, which surely isn't invalid? – Malkin, Aug 8 '18 at 22:04










  • @Malkin : If $X,Y$ are not just separately normal but jointly normal, then having zero covariance entails that they are independent. If $X,Y$ are independent, then $X^m,Y^n$ are independent. If $X^m, Y^n$ are independent, then $\operatorname E(X^m Y^n) = \operatorname E(X^m) \operatorname E(Y^n).$ Therefore $$\operatorname{cov}(X^m, Y^n) = \operatorname E(X^m Y^n) - \operatorname E(X^m) \operatorname E(Y^n) = \operatorname E(X^m) \operatorname E(Y^n) - \operatorname E(X^m) \operatorname E(Y^n) = 0.$$ – Michael Hardy, Aug 9 '18 at 2:04













