Variance of a mixed variable














I am studying probability on my own and couldn't find anything on this in my textbooks.
Say we have two random variables, $X$ and $Y$, with known means $\mu_{1}$ and $\mu_{2}$, and known variances $\sigma^{2}_{1}$ and $\sigma^{2}_{2}$.
$A$ is defined to be a linear combination of both variables. How can you interpret the variance of $A$? And how can you calculate $\textbf{E}(A^{2})$ in order to find that variance?



Does this generalize to more complex functions $A(X,Y)$?






















  • Concerning the calculation of the variance of a sum of random variables, search for "Bienaymé formula". Concerning the interpretation: what exactly are you thinking of? Do you have a general understanding of what the variance of a random variable is? Why should it be different for a sum? Concerning generalizations: any formula you know for $E(f(X))$ (e.g. if you know a density) can be applied to calculate the variance. Is this what you have in mind?
    – Mars Plastic, Feb 3 at 1:21










  • But in general, you cannot expect to calculate the variance of $X+Y$ just from the variances of $X$ and $Y$. If for example $X=Y$, we have $Var(X+Y)=4Var(X)$, but if they have the same distribution and are independent, we have $Var(X+Y)=2Var(X)$.
    – Mars Plastic, Feb 3 at 1:25
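A quick Monte Carlo check (not part of the original thread) illustrates this comment's two cases; the standard normal distribution below is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

X = rng.normal(0.0, 1.0, n)  # Var(X) = 1
Y = rng.normal(0.0, 1.0, n)  # same distribution as X, but independent of it

# X + X is the fully dependent case: Var(X+X) = Var(2X) = 4 Var(X)
print((X + X).var())  # ≈ 4
# X + Y is the independent case: Var(X+Y) = Var(X) + Var(Y) = 2 Var(X)
print((X + Y).var())  # ≈ 2
```

The two sums have identical marginal summands, yet their variances differ by a factor of two, which is exactly why the variances of $X$ and $Y$ alone do not determine the variance of the sum.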










  • I guess I was not paying attention to the role of independence in how the sum might vary. Taking independence into account, I think I now see that the meaning of the variance of a sum is not that different. Can you direct me to a link to learn more about expectations of functions of random variables?
    – Kumail Alhamoud, Feb 3 at 1:56
Tags: probability, variance
edited Feb 3 at 1:12
APC89
asked Feb 3 at 0:59
Kumail Alhamoud
1 Answer
According to the data provided and the properties of variance, writing $A = aX + bY$ with $X$ and $Y$ independent, we have
\begin{align*}
\textbf{Var}(A) = \textbf{Var}(aX+bY) = a^{2}\textbf{Var}(X) + b^{2}\textbf{Var}(Y) = a^{2}\sigma^{2}_{1} + b^{2}\sigma^{2}_{2}
\end{align*}

Since $X$ and $Y$ are independent, they are uncorrelated, which means that $\textbf{Cov}(X,Y) = 0$; consequently, the cross term $2ab\,\textbf{Cov}(X,Y)$ that appears in the general formula vanishes, and the expression above is correct.

As you may have noticed, we did not need to compute $\textbf{E}(A^{2})$ directly in order to obtain the variance: it is enough to use the properties of variance.

As for the general case, the strategy for solving the problem depends on the expression of $A(X,Y)$.
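As a sanity check on the formula (an illustrative sketch, not part of the original answer), a short simulation with arbitrarily chosen parameters compares the empirical variance of $A = aX + bY$ against $a^{2}\sigma^{2}_{1} + b^{2}\sigma^{2}_{2}$ for independent $X$ and $Y$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Arbitrary example: X ~ Normal(2, 3^2), Y ~ Normal(-1, 4^2), independent
mu1, sigma1 = 2.0, 3.0
mu2, sigma2 = -1.0, 4.0
a, b = 5.0, -2.0

X = rng.normal(mu1, sigma1, n)
Y = rng.normal(mu2, sigma2, n)
A = a * X + b * Y

# Closed form: a^2 sigma1^2 + b^2 sigma2^2 = 25*9 + 4*16 = 289
theory = a**2 * sigma1**2 + b**2 * sigma2**2
print(A.var(), theory)  # the two values agree up to Monte Carlo error
```

Swapping in any other independent distributions with the same variances leaves `theory` unchanged, which reflects that the formula uses only the variances, not the full distributions.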
  • I see how the general case might vary. As for $\textbf{E}(A^{2})$, in general, would you just expand the square of $A$ and then see if you get something you can work with?
    – Kumail Alhamoud, Feb 3 at 1:59










  • Given the information provided, a good starting point is to notice that $$\textbf{Var}(A) = \textbf{E}(A^{2}) - \textbf{E}(A)^{2}$$ Since we know the value of $\textbf{Var}(A)$ and $\textbf{E}(A)$ is easily calculated, you obtain the value of $\textbf{E}(A^{2})$. More precisely, we have $$\textbf{E}(A) = \textbf{E}(aX+bY) = a\textbf{E}(X) + b\textbf{E}(Y) = a\mu_{1} + b\mu_{2}$$
    – APC89, Feb 3 at 2:05
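The identity $\textbf{E}(A^{2}) = \textbf{Var}(A) + \textbf{E}(A)^{2}$ discussed in this comment can also be verified numerically; this is an illustrative sketch with arbitrarily chosen parameters, not part of the thread:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Arbitrary illustrative parameters for independent X and Y
mu1, sigma1, mu2, sigma2 = 1.0, 2.0, 3.0, 0.5
a, b = 2.0, 3.0

X = rng.normal(mu1, sigma1, n)
Y = rng.normal(mu2, sigma2, n)
A = a * X + b * Y

# E(A^2) = Var(A) + E(A)^2, using the closed forms for Var(A) and E(A)
var_A = a**2 * sigma1**2 + b**2 * sigma2**2
mean_A = a * mu1 + b * mu2
print((A**2).mean(), var_A + mean_A**2)  # the two values agree up to Monte Carlo error
```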
answered Feb 3 at 1:26
APC89