Adding Independent Random Variables Given Their Individual Expectations and Variances
How do I add or subtract independent random variables (R.V.s) when given their individual expectations and variances?
I'm a student in high school and I haven't covered distributions yet, so please try not to use them.
Example: R.V.s $A$ and $B$, where
$$E(A) = 35, \qquad \operatorname{Var}(A) = 8$$
$$E(B) = 25, \qquad \operatorname{Var}(B) = 9$$
Calculate the expectation and variance of $A + 2B$.
Tags: probability, random-variables, variance, expected-value
asked Jan 28 at 1:35, edited Jan 28 at 10:00 – landlockedorca
2 Answers
Let $A$ and $B$ be two random variables and $c$ be a constant. Then:

1. $\mathbb{E}[A + cB] = \mathbb{E}[A] + c\,\mathbb{E}[B]$, and
2. $\operatorname{Var}(A + cB) = \operatorname{Var}(A) + c^2 \operatorname{Var}(B)$ (assuming $A$ and $B$ are independent).
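Applying these two rules to the numbers in the question, with $c = 2$:
$$\mathbb{E}[A + 2B] = 35 + 2 \cdot 25 = 85, \qquad \operatorname{Var}(A + 2B) = 8 + 2^2 \cdot 9 = 44.$$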
Variance is defined in terms of the expectation. In particular, $\operatorname{Var}(X) = \mathbb{E}[(X - \mathbb{E}[X])^2]$. See if you can use this definition to prove property (2) from property (1).
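If you want to check the two rules numerically, here is a minimal simulation sketch in Python/NumPy. The normal distribution is an arbitrary choice made purely for sampling (the question avoids distributions, and any distributions with the stated means and variances would do):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Draw A and B independently.  The normal distribution is an arbitrary
# choice for sampling; only the stated means and variances matter here.
A = rng.normal(loc=35, scale=np.sqrt(8), size=n)  # E(A) = 35, Var(A) = 8
B = rng.normal(loc=25, scale=np.sqrt(9), size=n)  # E(B) = 25, Var(B) = 9

Y = A + 2 * B
print(Y.mean())  # close to 35 + 2*25    = 85
print(Y.var())   # close to 8 + 2**2 * 9 = 44
```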
answered Jan 28 at 1:48, edited Jan 28 at 5:46 – parsiad
Expectation is a linear function, so
$$E\left(\sum_{i=1}^n k_iX_i\right)=\sum_{i=1}^n k_iE(X_i).$$
Variance is "just like" squaring and covariance is "just like" multiplication, so we can expand variance using the identities for squares. In particular,
$$V(X_1+X_2)=V(X_1)+V(X_2)+2\operatorname{Cov}(X_1,X_2),$$
which is analogous to $(a+b)^2=a^2+b^2+2ab$.
Similarly,
$$V\left(\sum_{i=1}^n k_iX_i\right)=\sum_{i=1}^n k_i^2V(X_i) + 2\sum_{i=1}^n\sum_{j=1}^{i-1}k_ik_j\operatorname{Cov}(X_i,X_j).$$
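In case the sigma notation is unfamiliar: $\sum_{i=1}^n k_iX_i$ is shorthand for the sum $k_1X_1 + k_2X_2 + \cdots + k_nX_n$, with the index $i$ running from $1$ to $n$. For independent variables every covariance term vanishes, so the identity above reduces to
$$V(k_1X_1+\cdots+k_nX_n)=k_1^2V(X_1)+\cdots+k_n^2V(X_n).$$
With $n=2$, $k_1=1$, and $k_2=2$, this gives $V(A+2B)=V(A)+2^2V(B)=8+4\cdot 9=44$ for the question's example.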
Hope this helps.
answered Jan 28 at 2:03 – Martund
The section on variance using identities of squares is very clear. I haven't covered the notation you're using at the start and at the end yet, so I have no clue what you mean, sorry. – landlockedorca, Jan 28 at 9:35

Which notation are you not getting? – Martund, Jan 29 at 2:39

I haven't done covariance, nor have I seen the $n$ and the $i = 1$ above the sum sign before. – landlockedorca, Jan 30 at 8:34