Sum of two multinomial random variables
I have two independent multinomial random variables $Y_1$ and $Y_2$. I have to find the distribution of
$$X=Y_1+Y_2$$
$$Y_1 \sim \text{Multinomial}(n_1,(p_1,p_2,\dots,p_k))$$
$$Y_2 \sim \text{Multinomial}(n_2,(p_1,p_2,\dots,p_k))$$
I tried using convolution to calculate the distribution but got stuck after a while:
$$P(x_1,x_2,\dots,x_k) = \sum_{y_1,y_2,\dots,y_k} \binom{n_1}{y_1\,y_2\cdots y_k}p_1^{y_1}p_2^{y_2}\cdots p_k^{y_k} \binom{n_2}{(x_1-y_1)\,(x_2-y_2)\cdots(x_k-y_k)}p_1^{x_1-y_1}p_2^{x_2-y_2}\cdots p_k^{x_k-y_k}$$
such that $y_1+y_2+\dots+y_k = n_1$, and by similar reasoning we see that $x_1+x_2+\dots+x_k=n_1+n_2$.
$$P(x_1,x_2,\dots,x_k) = p_1^{x_1}p_2^{x_2}\cdots p_k^{x_k}\sum_{y_1,y_2,\dots,y_k} \binom{n_1}{y_1\,y_2\cdots y_k} \binom{n_2}{(x_1-y_1)\,(x_2-y_2)\cdots(x_k-y_k)}$$
$$P(x_1,x_2,\dots,x_k) = n_1!\,n_2!\; p_1^{x_1}p_2^{x_2}\cdots p_k^{x_k}\sum_{y_1,y_2,\dots,y_k} \frac{1}{y_1!\,y_2!\cdots y_k!} \cdot\frac{1}{(x_1-y_1)!\,(x_2-y_2)!\cdots(x_k-y_k)!}$$
$$P(x_1,x_2,\dots,x_k) = \frac{n_1!\,n_2!\; p_1^{x_1}p_2^{x_2}\cdots p_k^{x_k}}{x_1!\,x_2!\cdots x_k!}\sum_{y_1,y_2,\dots,y_k} \binom{x_1}{y_1}\binom{x_2}{y_2}\cdots\binom{x_k}{y_k}$$
But after this I couldn't solve it. Please help.
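[Editor's note: the remaining sum is a constrained Vandermonde convolution, $\sum_{y_1+\dots+y_k=n_1}\binom{x_1}{y_1}\cdots\binom{x_k}{y_k}=\binom{n_1+n_2}{n_1}$ (using $x_1+\dots+x_k=n_1+n_2$), which collapses the last line to the $\text{Multinomial}(n_1+n_2,(p_1,\dots,p_k))$ pmf. A quick brute-force check of that identity, with a helper name of my own choosing:]

```python
from itertools import product
from math import comb, prod

def constrained_binom_sum(x, m):
    """Sum of C(x_1,y_1)*...*C(x_k,y_k) over all y with sum(y) == m."""
    return sum(
        prod(comb(xi, yi) for xi, yi in zip(x, y))
        for y in product(*(range(xi + 1) for xi in x))
        if sum(y) == m
    )

x, n1 = (2, 1, 2), 3                 # here n1 + n2 = sum(x) = 5
print(constrained_binom_sum(x, n1))  # 10
print(comb(sum(x), n1))              # 10, i.e. C(5, 3): the identity holds
```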
probability statistics probability-distributions
Are they independent?
– Henry
Jan 17 at 8:33

Yeah, they are independent.
– Sauhard Sharma
Jan 17 at 9:04

Then, as $Y_1$ is the sum of $n_1$ independent $\text{Multinomial}(1,(p_1,p_2,\dots,p_k))$ variables and $Y_2$ is the sum of $n_2$ independent $\text{Multinomial}(1,(p_1,p_2,\dots,p_k))$ variables, you find $Y_1+Y_2$ is the sum of $n_1+n_2$ independent $\text{Multinomial}(1,(p_1,p_2,\dots,p_k))$ variables, which is $\text{Multinomial}(n_1+n_2,(p_1,p_2,\dots,p_k))$.
– Henry
Jan 17 at 10:19

How can you say that the sum of $n_1$ independent $\text{Multinomial}(1,(p_1,p_2,\dots,p_k))$ variables is $\text{Multinomial}(n_1,(p_1,p_2,\dots,p_k))$? Could you please provide a reference text for this?
– Sauhard Sharma
Jan 17 at 10:57

It may depend on your definition of $\text{Multinomial}(n,(p_1,p_2,\dots,p_k))$. Wikipedia says "For $n$ independent trials each of which leads to a success for exactly one of $k$ categories, with each category having a given fixed success probability, the multinomial distribution gives the probability of any particular combination of numbers of successes for the various categories", which I would have thought makes my point.
– Henry
Jan 17 at 11:03
asked Jan 17 at 5:49 by Sauhard Sharma (edited Jan 17 at 9:04)
1 Answer
It would be easier to use characteristic functions:
\begin{equation}
CF_{\text{Multinomial}(n,(p_1,\dots,p_k))}(t_1,\dots,t_k) = \bigg(\sum_{j=1}^k p_je^{it_j}\bigg)^n.
\end{equation}
As the CF of a sum of independent random variables is the product of their CFs, it is easy to spot that
\begin{equation}
X \sim \text{Multinomial}(n_1+n_2,(p_1,p_2,\dots,p_k)),
\end{equation}
since equality of CFs implies equality of distributions and
\begin{equation}
CF_X = CF_{Y_1+Y_2} = CF_{Y_1}CF_{Y_2} = \bigg(\sum_{j=1}^k p_je^{it_j}\bigg)^{n_1}\bigg(\sum_{j=1}^k p_je^{it_j}\bigg)^{n_2} = \bigg(\sum_{j=1}^k p_je^{it_j}\bigg)^{n_1+n_2} = CF_{\text{Multinomial}(n_1+n_2,(p_1,\dots,p_k))}(t_1,\dots,t_k).
\end{equation}
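[Editor's note: the CF argument can be sanity-checked numerically by comparing the brute-force convolution of the two multinomial pmfs with the $\text{Multinomial}(n_1+n_2,(p_1,\dots,p_k))$ pmf; a small sketch with helper names of my own choosing:]

```python
from itertools import product
from math import factorial, prod

def multinomial_pmf(counts, n, p):
    """Multinomial(n, p) pmf evaluated at the count vector `counts`."""
    if sum(counts) != n:
        return 0.0
    coeff = factorial(n) / prod(factorial(c) for c in counts)
    return coeff * prod(pj ** c for pj, c in zip(p, counts))

def convolved_pmf(x, n1, n2, p):
    """P(Y1 + Y2 = x) by summing over all splits of x between Y1 and Y2."""
    splits = product(*(range(xi + 1) for xi in x))
    return sum(
        multinomial_pmf(y, n1, p)
        * multinomial_pmf(tuple(xi - yi for xi, yi in zip(x, y)), n2, p)
        for y in splits
    )

n1, n2, p = 3, 2, (0.2, 0.3, 0.5)
x = (2, 1, 2)  # a point with x_1 + x_2 + x_3 = n_1 + n_2
print(abs(convolved_pmf(x, n1, n2, p) - multinomial_pmf(x, n1 + n2, p)) < 1e-12)  # True
```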
Or we could use moment-generating functions, to avoid complex numbers. Better still, we could use probability-generating functions; that has the added benefit of letting us read off the pmf directly afterwards if we'd like to.
– J.G.
Jan 17 at 7:45

@J.G. Could you please do that and show me?
– Sauhard Sharma
Jan 17 at 9:09

@J.G. What is bad about complex numbers? :-)
– Math-fun
Jan 17 at 9:21

@SauhardSharma Just replace $e^{it_j}$ with $t_j$.
– J.G.
Jan 17 at 12:16
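[Editor's note: J.G.'s probability-generating-function route can be spot-checked numerically. The PGF of $\text{Multinomial}(n,(p_1,\dots,p_k))$ is $G(t_1,\dots,t_k)=\big(\sum_j p_j t_j\big)^n$, and the PGFs of independent $Y_1$ and $Y_2$ multiply to give the PGF of $\text{Multinomial}(n_1+n_2,(p_1,\dots,p_k))$. A minimal sketch, with a helper name of my own choosing:]

```python
def multinomial_pgf(t, n, p):
    """PGF of Multinomial(n, p): G(t_1,...,t_k) = (sum_j p_j * t_j) ** n."""
    return sum(pj * tj for pj, tj in zip(p, t)) ** n

p, n1, n2 = (0.2, 0.3, 0.5), 3, 2
t = (0.7, 1.1, 0.4)  # arbitrary evaluation point
lhs = multinomial_pgf(t, n1, p) * multinomial_pgf(t, n2, p)  # PGF of Y1 + Y2
rhs = multinomial_pgf(t, n1 + n2, p)                         # PGF of Multinomial(n1+n2, p)
print(abs(lhs - rhs) < 1e-12)  # True
```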
answered Jan 17 at 7:07 by vermator (edited Jan 18 at 6:45)