Given $X\sim\text{Binomial}(m,p)$ and $Y\sim\text{Binomial}(n,p)$, calculate $\textbf{P}(X = x \mid X + Y = s)$
Let $X$ and $Y$ be independent RVs such that $X\sim\text{Binomial}(m,p)$ and $Y\sim\text{Binomial}(n,p)$. Determine
(a) $\textbf{P}(X = x \mid X+Y = s)$
(b) $\textbf{E}(X\mid X + Y)$ and $\textbf{V}(X \mid X+Y)$
(c) Check that $\textbf{E}\{\textbf{E}(X\mid X+Y)\} = \textbf{E}(X)$
MY ATTEMPT
Firstly, I would like to determine the distribution of $Z = X+Y$. Since $X$ and $Y$ are independent,
\begin{align*}
\textbf{P}(Z = z) = \textbf{P}(X + Y = z) = \sum_{x=0}^{z}\textbf{P}(X = x, Y = z - x) = \sum_{x=0}^{z}\textbf{P}(X = x)\textbf{P}(Y = z-x)
\end{align*}
Consequently, we have
\begin{align*}
p_{Z}(z) & = \textbf{P}(Z = z) = \sum_{x=0}^{z}{n\choose z-x}p^{z-x}(1-p)^{n-z+x}{m\choose x}p^{x}(1-p)^{m-x}\\
& = \sum_{x=0}^{z}{n\choose z-x}{m\choose x}p^{z}(1-p)^{m + n - z} = {m+n\choose z}p^{z}(1-p)^{m+n-z},
\end{align*}
where the last equality is Vandermonde's identity $\sum_{x=0}^{z}{m\choose x}{n\choose z-x} = {m+n\choose z}$; the summands with $x > m$ or $z - x > n$ vanish, so the index range $0 \le x \le z$ is harmless.
Another possible approach is to write each Binomial random variable as a sum of independent Bernoulli random variables, which immediately gives $Z\sim\text{Binomial}(m+n,p)$. Based on this, could someone help me out?
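As an optional numerical sanity check (not part of the original argument; the parameter values below are arbitrary), the convolution sum can be compared against the $\text{Binomial}(m+n,p)$ pmf using only the standard library:

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial(n, p) probability mass at k; zero outside {0, ..., n}."""
    if k < 0 or k > n:
        return 0.0
    return comb(n, k) * p**k * (1 - p)**(n - k)

def conv_pmf(z, m, n, p):
    """Convolution of Binomial(m, p) and Binomial(n, p) at z."""
    return sum(binom_pmf(x, m, p) * binom_pmf(z - x, n, p) for x in range(z + 1))

m, n, p = 4, 6, 0.3
for z in range(m + n + 1):
    # The convolution should match Binomial(m + n, p) up to rounding error.
    assert abs(conv_pmf(z, m, n, p) - binom_pmf(z, m + n, p)) < 1e-12
```

The out-of-range terms in the convolution contribute zero, matching the index remark above.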
EDIT
(a) By the definition of conditional probability and the independence of $X$ and $Y$, we have
\begin{align*}
\textbf{P}(X = x \mid X + Y = s) & = \frac{\textbf{P}(X = x, X+Y = s)}{\textbf{P}(X+Y = s)} = \frac{\textbf{P}(X = x)\textbf{P}(Y = s - x)}{\textbf{P}(X+Y = s)}\\
& = \frac{\displaystyle{m\choose x}{n\choose s-x}p^{s}(1-p)^{m+n-s}}{\displaystyle{m+n\choose s}p^{s}(1-p)^{m+n-s}} = \frac{\displaystyle{m\choose x}{n\choose s - x}}{\displaystyle{m+n\choose s}},
\end{align*}
so conditional on $X+Y = s$, $X$ follows a Hypergeometric distribution with parameters $(m+n, m, s)$.
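This cancellation can also be checked numerically (again an optional check with illustrative parameters): the Hypergeometric ratio should agree with the conditional probability computed directly from the Binomial pmfs, and should sum to $1$ over the support $\max(0, s-n) \le x \le \min(m, s)$.

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial(n, p) probability mass at k; zero outside {0, ..., n}."""
    if k < 0 or k > n:
        return 0.0
    return comb(n, k) * p**k * (1 - p)**(n - k)

def cond_pmf(x, s, m, n):
    """P(X = x | X + Y = s): the Hypergeometric(m + n, m, s) pmf derived above."""
    return comb(m, x) * comb(n, s - x) / comb(m + n, s)

m, n, p, s = 4, 6, 0.3, 5
support = range(max(0, s - n), min(m, s) + 1)
for x in support:
    direct = binom_pmf(x, m, p) * binom_pmf(s - x, n, p) / binom_pmf(s, m + n, p)
    assert abs(cond_pmf(x, s, m, n) - direct) < 1e-12
# The conditional pmf is a genuine distribution: it sums to 1 (Vandermonde again).
assert abs(sum(cond_pmf(x, s, m, n) for x in support) - 1) < 1e-12
```

Note that $p$ cancels, so the conditional distribution does not depend on it.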
(b) From the mean of a Hypergeometric random variable (see Wikipedia), we have
\begin{align*}
\textbf{E}(X \mid X + Y = s) = \frac{sm}{m+n}\quad\text{and}\quad\textbf{V}(X \mid X+Y = s) = \ldots
\end{align*}
(c) Finally, we have
\begin{align*}
\textbf{E}\{\textbf{E}(X\mid X+Y)\} = \textbf{E}\left\{\frac{m}{m+n}(X+Y)\right\} = \frac{m}{m+n}\textbf{E}(X+Y) = \frac{m(m+n)p}{m+n} = mp = \textbf{E}(X)
\end{align*}
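Part (c) can also be verified exactly by summing the conditional mean $sm/(m+n)$ against the pmf of $X+Y \sim \text{Binomial}(m+n, p)$ (one more optional check with illustrative parameters):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial(n, p) probability mass at k; zero outside {0, ..., n}."""
    if k < 0 or k > n:
        return 0.0
    return comb(n, k) * p**k * (1 - p)**(n - k)

m, n, p = 4, 6, 0.3
# E[E(X | X+Y)] = sum over s of (s*m/(m+n)) * P(X+Y = s).
lhs = sum(s * m / (m + n) * binom_pmf(s, m + n, p) for s in range(m + n + 1))
# Tower property: this should equal E(X) = m*p.
assert abs(lhs - m * p) < 1e-12
```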
Tags: probability, probability-theory, conditional-expectation, variance, expected-value
Comments:

– user321627 (Jan 26 at 23:38): You definitely can consider each binomial as the sum of Bernoullis. If you choose to go the way of densities, what will really help is to consider Vandermonde's identity (first recorded by Zhu Shijie in China in 1303): $$\sum_{j=0}^{k}{n\choose k-j}{m\choose j} = {m+n\choose k}$$

– user321627 (Jan 27 at 0:15): I would also check the summation indices you have above.

– user52227 (Jan 27 at 13:33): There are some conceptual mistakes in your attempt and edit. The second equality in "My Attempt" is wrong (what you get next would be correct, using Vandermonde's identity, but that's not a consequence of what you wrote just before). The same is true for the first equality in "My Edit" (here, first use the definition of conditional probability).

– user1337 (Jan 27 at 18:32): Thanks for the contributions, user52227. I think I have implemented them correctly this time. By the way, could you provide an answer to the last updated question?

– user321627 (Jan 27 at 21:41): Where's the last updated question?
asked Jan 26 at 20:30 by user1337; edited Jan 27 at 20:40