Expectation and variance of the number of elements of a random non-empty set selected from a finite power set
Let $S$ denote a finite set of cardinality $|S| = N$. Select a non-empty subset of $S$ uniformly at random, and let $X$ denote the number of elements of this subset.
(a) Describe the probability mass function of $X$ and compute $\textbf{E}(X)$ and $\textbf{Var}(X)$.
(b) Prove that $\displaystyle\lim_{N\rightarrow\infty}\frac{\textbf{E}(X)}{N} = \frac{1}{2}$ and $\displaystyle\lim_{N\rightarrow\infty}\frac{\textbf{Var}(X)}{N} = \frac{1}{4}$.
MY ATTEMPT
(a) First, we notice that there are ${N\choose x}$ subsets with exactly $x$ elements, where $1\leq x\leq N$. Therefore we conclude that
\begin{align*}
f_{X}(x) = \frac{1}{2^{N} - 1}{N\choose x}
\end{align*}
From this we can determine $\textbf{E}(X)$ and $\textbf{Var}(X)$. Precisely, we have
\begin{align*}
\sum_{x=1}^{N}{N\choose x}x & = \sum_{x=1}^{N}\frac{N!}{(x-1)!(N-x)!} = \sum_{x=0}^{N-1}\frac{N!}{x!(N-x-1)!}\\
& = N\sum_{x=0}^{N-1}\frac{(N-1)!}{x!(N-x-1)!} = N\sum_{x=0}^{N-1}{N-1\choose x} = N2^{N-1}
\end{align*}
by the binomial theorem, whence
$$\textbf{E}(X) = \frac{N2^{N-1}}{2^{N}-1}$$
Furthermore, we have the identity
\begin{align*}
\sum_{x=1}^{N}{N\choose x}x^{2} & = \sum_{x=1}^{N}\left[\frac{N!}{(x-1)!(N-x)!}\right]x = \sum_{x=0}^{N-1}\left[\frac{N!}{x!(N-x-1)!}\right](x+1)\\
& = \sum_{x=1}^{N-1}\frac{N!}{(x-1)!(N-x-1)!} + \sum_{x=0}^{N-1}\frac{N!}{x!(N-x-1)!}\\
& = N(N-1)\sum_{x=0}^{N-2}\frac{(N-2)!}{x!(N-x-2)!} + N\sum_{x=0}^{N-1}\frac{(N-1)!}{x!(N-x-1)!}\\
& = N(N-1)\sum_{x=0}^{N-2}{N-2\choose x} + N\sum_{x=0}^{N-1}{N-1\choose x} = N(N-1)2^{N-2} + N2^{N-1}
\end{align*}
Therefore
$$\textbf{E}(X^{2}) = \frac{N(N-1)2^{N-2} + N2^{N-1}}{2^{N}-1}$$
Finally, we have the variance:
\begin{align*}
\textbf{Var}(X) & = \textbf{E}(X^{2}) - \textbf{E}(X)^{2} = \frac{N(N-1)2^{N-2} + N2^{N-1}}{2^{N}-1} - \left(\frac{N2^{N-1}}{2^{N}-1}\right)^{2}
\end{align*}
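As a sanity check (my own addition, not part of the proof), the closed forms for $\textbf{E}(X)$ and $\textbf{E}(X^{2})$ can be verified against a brute-force enumeration of all non-empty subsets for small $N$:

```python
from fractions import Fraction
from itertools import combinations

def moments_bruteforce(N):
    # Enumerate every non-empty subset of {0, ..., N-1} and average
    # the size and squared size over all 2^N - 1 of them.
    sizes = [x for x in range(1, N + 1) for _ in combinations(range(N), x)]
    total = len(sizes)  # equals 2**N - 1
    e1 = Fraction(sum(sizes), total)
    e2 = Fraction(sum(s * s for s in sizes), total)
    return e1, e2

def moments_formula(N):
    # Closed forms derived above for E(X) and E(X^2).
    total = 2**N - 1
    e1 = Fraction(N * 2**(N - 1), total)
    e2 = Fraction(N * (N - 1) * 2**(N - 2) + N * 2**(N - 1), total)
    return e1, e2

for N in range(2, 11):
    assert moments_bruteforce(N) == moments_formula(N)
```

Exact rational arithmetic (`Fraction`) avoids any floating-point doubt in the comparison.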
(b) As requested, we obtain the desired result
\begin{align*}
\lim_{N\rightarrow\infty}\frac{\textbf{E}(X)}{N} = \lim_{N\rightarrow\infty}\frac{2^{N-1}}{2^{N}-1} = \frac{1}{2}
\end{align*}
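Numerically (an illustrative check of my own, not a proof), the closed forms already suggest both limits; a short sketch:

```python
def limits_check(N):
    # E(X)/N and Var(X)/N computed from the closed forms above.
    total = 2**N - 1
    e1 = N * 2**(N - 1) / total
    e2 = (N * (N - 1) * 2**(N - 2) + N * 2**(N - 1)) / total
    return e1 / N, (e2 - e1**2) / N

# Both ratios tend to 1/2 and 1/4 respectively as N grows.
for N in (5, 10, 20, 40):
    mean_ratio, var_ratio = limits_check(N)
    print(N, mean_ratio, var_ratio)
```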
However, I do not obtain the corresponding limit for the variance when I use the results above. Am I making a conceptual mistake? I would be equally grateful if someone could show me a quicker or smarter way to tackle this problem.
EDIT
I edited my answer as suggested by Did. Nonetheless, I still think something is wrong with my solution. Could someone double-check my calculations? Thanks in advance.
probability-theory proof-verification probability-distributions alternative-proof binomial-distribution
Your first mistake (I did not look further) is when you write $$\sum_{x=1}^{N-1}\frac{N!}{(x-1)!(N-x-1)!} = \sum_{x=0}^{N-2}\frac{N!}{x!(N-x)!}$$ Yes, one should replace $x$ by $x-1$, but then $(N-x-1)!$ becomes $(N-x-2)!$, not $(N-x)!$.
– Did
Jan 22 at 19:54
For an alternative proof, use the fact that $X$ equals in distribution $Y$ conditioned on $Y\ne0$, where $Y$ is binomial $(N,\frac12)$. Thus, $$E(X)=E(Y\mid Y\ne0)=\frac{E(Y)}{1-p}\qquad E(X^2)=E(Y^2\mid Y\ne0)=\frac{E(Y^2)}{1-p}$$ where $$p=P(Y=0)=2^{-N}$$ and $E(Y)$ and $E(Y^2)$ should be in your bag of known facts. Likewise, for every positive $k$, $$E(X^k)=\frac{E(Y^k)}{1-p}$$
– Did
Jan 22 at 19:57
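This conditioning identity is easy to verify numerically. Below is my own small sketch (exact rational arithmetic, for the $p = \frac12$ case only), checking $E(X^k) = E(Y^k)/(1-p)$ for small $N$ and $k$:

```python
from fractions import Fraction
from math import comb

def conditional_moment(N, k):
    # E[Y^k | Y != 0] for Y ~ Binomial(N, 1/2), by direct summation
    # over the 2^N - 1 nonzero outcomes.
    total = Fraction(2**N - 1)
    return sum(Fraction(comb(N, y)) * y**k for y in range(1, N + 1)) / total

def ratio_moment(N, k):
    # E[Y^k] / (1 - p) with p = P(Y = 0) = 2^(-N).
    ey_k = sum(Fraction(comb(N, y), 2**N) * y**k for y in range(N + 1))
    p = Fraction(1, 2**N)
    return ey_k / (1 - p)

for N in range(1, 8):
    for k in range(1, 4):
        assert conditional_moment(N, k) == ratio_moment(N, k)
```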
Did, can you add your comment as an answer so I can better understand it and vote it up?
– user1337
Jan 26 at 17:42
"However, the same does not apply to the variance when I make use of the results obtained." Again, we cannot know what, if anything, you did wrong, since you show neither what you did nor what you got for the variance. The partial computations you do show indeed lead to $\mathrm{Var}(X_N)\sim N/4$.
– Did
Jan 26 at 18:49
asked Jan 20 at 3:12 – user1337
edited Jan 24 at 1:55
1 Answer
It seems simpler if you look at it this way: say $S$ has $n$ elements. If $Y$ is the number of elements of a randomly selected element of the power set, including the empty set as a possibility, then $Y$ has the same distribution as $X_1+\dots+X_n$, where the $X_j$ are i.i.d. with $P(X_1=0)=P(X_1=1)=1/2$.
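To illustrate this identification (my own sketch, not the answerer's code): enumerating all $2^n$ indicator vectors shows the size of a uniformly random subset, empty set included, is exactly Binomial$(n, 1/2)$, i.e. a sum of $n$ i.i.d. fair Bernoulli indicators:

```python
from fractions import Fraction
from itertools import product
from math import comb

def pmf_subset_size(n):
    # Each subset of an n-element set corresponds to an indicator
    # vector in {0,1}^n; the subset's size is the sum of the indicators.
    counts = {}
    for bits in product((0, 1), repeat=n):
        s = sum(bits)
        counts[s] = counts.get(s, 0) + 1
    return {s: Fraction(c, 2**n) for s, c in counts.items()}

n = 6
pmf = pmf_subset_size(n)
# The size distribution matches Binomial(n, 1/2) term by term.
assert all(pmf[s] == Fraction(comb(n, s), 2**n) for s in range(n + 1))
```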
OK, what's the problem with that? The OP said, "I would be equally grateful if someone provided me a quicker or smarter way to tackle this problem."
– David C. Ullrich
Jan 23 at 15:44
Well, I cannot speak for others, but the fact is that you do not tackle the problem asked, since your $Y$ is not distributed like the $X$ in the question.
– Did
Jan 26 at 18:50
answered Jan 20 at 3:26 – David C. Ullrich