Finding the mean and variance of the number of successes in a sequence of independent trials.
In a sequence of $n$ independent trials, the probability of a success at the $i^{\mathrm{th}}$ trial is $p_i$. Find the mean and variance of the total number of successes.
My problem is whether I should let $X_i$ indicate that the $i^{\mathrm{th}}$ trial is a success, or that $i$ trials have been successful, where $X=X_1+X_2+\cdots+X_n$.
probability
Your idea is a useful one. We define the random variable $X_i$ by $X_i=1$ if we have a success on the $i$-th trial, and by $X_i=0$ otherwise. Then the number $X$ of successes is given by $X=X_1+\cdots+X_n$. Now $E(X)$ is immediate by the linearity of expectation. For the variance, it will be a good idea to expand $(X_1+\cdots+X_n)^2$.
– André Nicolas
Feb 24 '15 at 17:14
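[The indicator-variable setup described in this comment can be sketched in a few lines of Python; the probabilities below are arbitrary example values, not from the question:]

```python
import random

# Hypothetical per-trial success probabilities (example values only).
p = [0.3, 0.6, 0.8]

# X_i = 1 if the i-th trial succeeds, 0 otherwise; X is the total number of successes.
x = [1 if random.random() < pi else 0 for pi in p]
X = sum(x)

# By linearity of expectation, E[X] = p_1 + ... + p_n.
expected_successes = sum(p)
```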
Thank you, I think I can see where to go from here!
– guest10923
Feb 24 '15 at 17:36
You are welcome. I thought it best to outline things only, so that you could do the rest. Note that there is a simpler way to get at the variance, since we are dealing with an independent sum.
– André Nicolas
Feb 24 '15 at 17:40
Hmm, I can't find a way to tidy up the $(X_1+\cdots+X_n)^2$ expression. I tried to use the fact that the trials are independent, and therefore $E(XY)=E(X)E(Y)$. I am not sure I know a simpler formula for the variance.
– guest10923
Feb 24 '15 at 18:00
The simple way is to use $\text{Var}(X)=\sum \text{Var}(X_i)$. An easy computation (or standard fact) shows that $\text{Var}(X_i)=p_i(1-p_i)$. The harder way is to expand. The mean of $X_i^2$ is $p_i$, since $X_i^2=X_i$. The cross terms have expectation $2\sum_{i<j}p_ip_j$, so the expectation of $X^2$ is $\sum p_i+2\sum_{i<j}p_ip_j$. Subtract $(E(X))^2$; we get a messy expression that simplifies a lot.
– André Nicolas
Feb 24 '15 at 18:12
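[As a numerical sanity check, not part of the original exchange, the two routes in this comment can be compared directly; the probabilities are arbitrary example values:]

```python
# Example probabilities (arbitrary, for illustration only).
p = [0.2, 0.5, 0.9]
n = len(p)

# Route 1: expand E[X^2] = sum p_i + 2*sum_{i<j} p_i p_j, then subtract (E[X])^2.
mean = sum(p)
cross = sum(p[i] * p[j] for i in range(n) for j in range(i + 1, n))
var_expanded = (sum(p) + 2 * cross) - mean ** 2

# Route 2: Var(X) = sum Var(X_i) = sum p_i(1 - p_i), valid for independent trials.
var_direct = sum(pi * (1 - pi) for pi in p)

assert abs(var_expanded - var_direct) < 1e-12  # both come out to 0.5 here
```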
edited Feb 24 '15 at 20:17 by Math1000
asked Feb 24 '15 at 17:08 by guest10923
1 Answer
Recall that for independent random variables $X_i$:
$$\mathbb{E}\left[\sum_{i=1}^n X_i\right] = \sum_{i=1}^n \mathbb{E}[X_i]$$
and
$$\operatorname{Var}\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \operatorname{Var}(X_i).$$
(The first identity, linearity of expectation, holds even without independence; independence is needed only for the variance.) Applying these with $\mathbb{E}[X_i]=p_i$ and $\operatorname{Var}(X_i)=p_i(1-p_i)$, we get that the mean and variance of the sum are
$$\sum_{i=1}^n p_i$$
and
$$\sum_{i=1}^n p_i(1-p_i),$$
respectively.
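[A quick Monte Carlo check of these formulas; the probabilities and repetition count are arbitrary illustrative choices, not from the answer:]

```python
import random

random.seed(0)
p = [0.1, 0.4, 0.7, 0.95]  # example success probabilities
reps = 200_000

# Simulate X = X_1 + ... + X_n many times.
totals = [sum(1 for pi in p if random.random() < pi) for _ in range(reps)]

sample_mean = sum(totals) / reps
sample_var = sum((t - sample_mean) ** 2 for t in totals) / reps

theory_mean = sum(p)                         # 2.15
theory_var = sum(pi * (1 - pi) for pi in p)  # 0.5875
# sample_mean and sample_var should land within about 0.01 of these values
```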
answered Feb 24 '15 at 20:41 by Math1000