Conditional probability and binomial distributions: am I doing it right?
We toss $n$ coins, and each one shows heads with probability $p$, independently of each of the others. Each coin which shows heads is tossed again. What is the mass function of the number of heads resulting from the second round of tosses?



MY ATTEMPT



Let $X$ have a binomial distribution with parameters $n$ and $p$. Then the probability that $k$ tosses show heads in the first round, where $0\leq k\leq n$, is given by



\begin{align*}
\textbf{P}(X = k) = \binom{n}{k}p^{k}(1-p)^{n-k}
\end{align*}



Let $Y$ denote the number of heads resulting from the second round of tosses. This process can be formulated as follows:
\begin{align*}
\textbf{P}(Y = m\mid X = k) & = \frac{\textbf{P}(\{Y = m\}\cap\{X = k\})}{\textbf{P}(X = k)} = \frac{\displaystyle\binom{k}{m}p^{m}(1-p)^{k-m}}{\displaystyle\binom{n}{k}p^{k}(1-p)^{n-k}}
\end{align*}



where $0\leq m\leq k\leq n$.
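A quick numerical sanity check (a sketch with arbitrary values $n=5$, $p=0.3$, $k=3$, $m=2$, not part of the original problem) can compare a simulated estimate of $\textbf{P}(Y=m\mid X=k)$ with the ratio above:

```python
import math
import random

# Arbitrary small parameters for the check
n, p = 5, 0.3
k, m = 3, 2  # condition on X = k, estimate P(Y = m | X = k)

random.seed(0)
trials = 200_000
hits = cond = 0
for _ in range(trials):
    first = sum(random.random() < p for _ in range(n))   # X: heads in round 1
    if first != k:
        continue
    cond += 1
    second = sum(random.random() < p for _ in range(k))  # Y given X = k
    if second == m:
        hits += 1

empirical = hits / cond
# The ratio written above
candidate = (math.comb(k, m) * p**m * (1 - p)**(k - m)) / (
    math.comb(n, k) * p**k * (1 - p)**(n - k)
)
# Direct Binomial(k, p) mass at m
direct = math.comb(k, m) * p**m * (1 - p)**(k - m)
print(empirical, candidate, direct)
```

Since a probability can never exceed $1$, a ratio value above $1$ for some $(k, m)$ would be a red flag.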



Can someone double-check my reasoning? Thanks in advance.
      probability probability-theory proof-verification
      asked Jan 23 at 0:17
user1337
2 Answers
It appears that you have decided the probability mass function of $X$ is
$P(X = k) = \binom nk p^k(1-p)^{n-k}.$
That is a correct result; good!



Your working for the probability $P(\{Y = m\}\cap\{X = k\})$, however, appears to be incorrect.
Think about what has to happen for the event $\{Y = m\}\cap\{X = k\}$ to occur:
first you must toss $n$ coins and $k$ of them must come up heads.
(You already found the probability of that; it is $\binom nk p^k(1-p)^{n-k}.$)
Then you must toss $k$ coins and have $m$ of them come up heads;
the probability of that (assuming you do toss $k$ coins)
is $\binom km p^m(1-p)^{k-m},$ which you have duly written in the numerator of your result. But what happened to the step where you tossed the $n$ coins and got $k$ heads? Did you just assume it had already happened, so that its probability could be treated as $1$?



The expression in the numerator of your result is not $P(\{Y = m\}\cap\{X = k\})$;
it is, in fact, $P(Y = m\mid X = k),$
the probability of getting $m$ heads (on the second round) given that you toss $k$ coins (on the second round), that is, given that you got $k$ heads on the first round so that you toss $k$ coins on the second round.



But we're not done yet, because I do not believe this problem is asking you to compute $P(Y = m\mid X = k).$
Does it say "given" anywhere after the phrase "the mass function of the number of heads resulting from the second round of tosses"?



          I believe you were asked to find $P(Y = m).$



Now you can solve that problem the hard way, which is to take your values of
$P(Y = m\mid X = k)$ for each $k = 0, \ldots, n$ (or just $k = m, \ldots, n,$ since the probability is zero when $k < m$)
and your values of $P(X=k)$ for each $k,$ write a big sum using the law of total probability, and add it up.
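The "big sum" from the law of total probability is easy to evaluate numerically; a sketch (with illustrative values $n=6$, $p=0.4$, not from the original problem):

```python
import math

n, p = 6, 0.4  # illustrative values

def binom_pmf(j, size, prob):
    """Binomial(size, prob) mass function at j."""
    return math.comb(size, j) * prob**j * (1 - prob)**(size - j)

# Law of total probability:
# P(Y = m) = sum over k of P(Y = m | X = k) * P(X = k)
def p_y(m):
    return sum(
        binom_pmf(m, k, p) * binom_pmf(k, n, p)
        for k in range(m, n + 1)  # terms with k < m are zero
    )

for m in range(n + 1):
    print(m, p_y(m))
```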



Or you can solve it the easy way, which is to ask this:
for each of the $n$ coins, how likely is it that this coin will contribute a head to the count of heads in the second round?
That is, for coin number $i,$ where $i = 1, \ldots, n,$
let $Y_i = 1$ if the coin was tossed in the second round and came up heads,
$Y_i = 0$ otherwise.
Then the number of heads in the second round is $Y = \sum_{i=1}^n Y_i.$



          In order for $Y_i=1$ to occur, coin $i$ must come up heads twice in a row: on the first toss and on the second toss.
          The probability of that is $p^2.$
          To have $Y_i=0$ we just need the coin to come up tails on the first toss
          or heads and then tails. The probability of that is $1 - p^2.$



          So $Y_i$ is a Bernoulli variable with success probability $p^2,$ and $Y$ is a binomial variable with parameters $n$ (number of trials) and $p^2$ (probability of success in each trial). Hence $Y$ has the probability mass function determined by the formula



$$ P(Y = m) = \binom nm (p^2)^m(1-p^2)^{n-m}. $$
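That conclusion can be confirmed by simulation; a sketch (illustrative parameters $n=8$, $p=0.5$, chosen arbitrarily):

```python
import math
import random

n, p = 8, 0.5  # illustrative parameters
random.seed(1)
trials = 100_000

counts = [0] * (n + 1)
for _ in range(trials):
    # First round: toss n coins; second round: re-toss only the heads.
    heads_first = sum(random.random() < p for _ in range(n))
    y = sum(random.random() < p for _ in range(heads_first))
    counts[y] += 1

# Compare empirical frequencies with the Binomial(n, p^2) mass function.
for m in range(n + 1):
    empirical = counts[m] / trials
    exact = math.comb(n, m) * (p**2)**m * (1 - p**2)**(n - m)
    print(m, round(empirical, 4), round(exact, 4))
```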
That's good, except that in the last step you equivocate on which expression should be the conditional probability: it is
$$
P\left( Y = m \mid X = k \right) = \binom{k}{m} p^{m} \left( 1 - p \right)^{k - m}
$$
because, in fact,
$$
\sum\limits_m P\left( Y = m \mid X = k \right) = 1
$$

So
$$
\begin{aligned}
P\left( Y = m \wedge X = k \right)
& = \binom{k}{m} p^{m} \left( 1 - p \right)^{k - m} \binom{n}{k} p^{k} \left( 1 - p \right)^{n - k} \\
& = \binom{n}{k} \binom{k}{m} p^{m + k} \left( 1 - p \right)^{n - m}
\end{aligned}
$$

And the probability you are looking for is
$$
\begin{aligned}
P\left( Y = m \right)
& = \sum_{m \,\le\, k \,\le\, n} \binom{n}{k} \binom{k}{m} p^{m + k} \left( 1 - p \right)^{n - m} \\
& = \sum_{m \,\le\, k \,\le\, n} \binom{n}{m} \binom{n - m}{k - m} p^{m + k} \left( 1 - p \right)^{n - m} \\
& = \binom{n}{m} \left( 1 - p \right)^{n - m} p^{2m} \sum_{0 \,\le\, k - m \,\le\, n - m} \binom{n - m}{k - m} p^{k - m} \\
& = \binom{n}{m} \left( 1 - p \right)^{n - m} p^{2m} \left( 1 + p \right)^{n - m} \\
& = \binom{n}{m} \left( p^{2} \right)^{m} \left( 1 - p^{2} \right)^{n - m}
\end{aligned}
$$
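The chain of identities above can be spot-checked numerically; a sketch (an arbitrary $p = 0.35$ and a few values of $n$, chosen only for the check):

```python
import math

p = 0.35  # arbitrary test probability
max_diff = 0.0
for n in (4, 7, 10):
    for m in range(n + 1):
        # Left side: the sum over k from the derivation above.
        lhs = sum(
            math.comb(n, k) * math.comb(k, m) * p**(m + k) * (1 - p)**(n - m)
            for k in range(m, n + 1)
        )
        # Right side: the closed-form Binomial(n, p^2) mass function.
        rhs = math.comb(n, m) * (p**2)**m * (1 - p**2)**(n - m)
        max_diff = max(max_diff, abs(lhs - rhs))
print(max_diff)
```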
answered Jan 23 at 1:52 by David K (edited Jan 23 at 14:30)























                      1












                      $begingroup$

                      That's good, except that in the last step you equivocate on which
                      should be the conditional probability: it is
                      $$
                      Pleft( {Y = m|X = k} right) = binom{k}{m}p^{,m} left( {1 - p} right)^{,k - m}
                      $$

                      because, in fact
                      $$
                      sumlimits_m {Pleft( {Y = m|X = k} right)} = 1
                      $$



                      So
                      $$
                      eqalign{
                      & Pleft( {Y = m wedge X = k} right)
                      = binom{k}{m}
                      p^{,m} left( {1 - p} right)^{,k - m} left( matrix{
                      n cr
                      k cr} right)p^{,k} left( {1 - p} right)^{,n - k} = cr
                      & = left( matrix{
                      n cr
                      k cr} right)left( matrix{
                      k cr
                      m cr} right)p^{,m + k} left( {1 - p} right)^{,n - m} cr}
                      $$



                      And the probability you are looking for is
                      $$
                      eqalign{
                      & Pleft( {Y = m} right) = sumlimits_{left( {m, le } right),k,left( { le ,n} right)} {left( matrix{
                      n cr
                      k cr} right)left( matrix{
                      k cr
                      m cr} right)p^{,m + k} left( {1 - p} right)^{,n - m} } = cr
                      & = sumlimits_{left( {m, le } right),k,left( { le ,n} right)} {left( matrix{
                      n cr
                      m cr} right)left( matrix{
                      n - m cr
                      k - m cr} right)p^{,m + k} left( {1 - p} right)^{,n - m} } = cr
                      & = left( matrix{
                      n cr
                      m cr} right)left( {1 - p} right)^{,n - m} p^{,2m} sumlimits_{left( {0, le } right),k - m,left( { le ,n - m} right)} {left( matrix{
                      n - m cr
                      k - m cr} right)p^{,k - m} } = cr
                      & = left( matrix{
                      n cr
                      m cr} right)left( {1 - p} right)^{,n - m} p^{,2m} left( {1 + p} right)^{,n - m} = cr
                      & = left( matrix{
                      n cr
                      m cr} right)left( {p^{,2} } right)^m left( {1 - p^{,2} } right)^{,n - m} cr}
                      $$






                      share|cite|improve this answer









                      $endgroup$


















                        1












                        $begingroup$

                        That's good, except that in the last step you equivocate on which
                        should be the conditional probability: it is
                        $$
                        Pleft( {Y = m|X = k} right) = binom{k}{m}p^{,m} left( {1 - p} right)^{,k - m}
                        $$

                        because, in fact
                        $$
                        sumlimits_m {Pleft( {Y = m|X = k} right)} = 1
                        $$



                        So
                        $$
                        eqalign{
                        & Pleft( {Y = m wedge X = k} right)
                        = binom{k}{m}
                        p^{,m} left( {1 - p} right)^{,k - m} left( matrix{
                        n cr
                        k cr} right)p^{,k} left( {1 - p} right)^{,n - k} = cr
                        & = left( matrix{
                        n cr
                        k cr} right)left( matrix{
                        k cr
                        m cr} right)p^{,m + k} left( {1 - p} right)^{,n - m} cr}
                        $$



And the probability you are looking for is
$$
\begin{aligned}
P\left( Y = m \right)
&= \sum\limits_{m \le k \le n} \binom{n}{k} \binom{k}{m} p^{m + k} \left( 1 - p \right)^{n - m} \\
&= \sum\limits_{m \le k \le n} \binom{n}{m} \binom{n - m}{k - m} p^{m + k} \left( 1 - p \right)^{n - m} \\
&= \binom{n}{m} \left( 1 - p \right)^{n - m} p^{2m} \sum\limits_{0 \le k - m \le n - m} \binom{n - m}{k - m} p^{k - m} \\
&= \binom{n}{m} \left( 1 - p \right)^{n - m} p^{2m} \left( 1 + p \right)^{n - m} \\
&= \binom{n}{m} \left( p^{2} \right)^{m} \left( 1 - p^{2} \right)^{n - m}
\end{aligned}
$$
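The conclusion is that $Y$ is simply $\text{Binomial}(n, p^{2})$, which makes sense: each coin ends up heads after both rounds with probability $p \cdot p$. A quick numerical sketch (function names and parameters are mine) confirming that the sum over $k$ matches the closed form:

```python
from math import comb

def pmf_direct(n, p, m):
    """P(Y = m) by summing the joint C(n,k) C(k,m) p^(m+k) (1-p)^(n-m) over k."""
    return sum(
        comb(n, k) * comb(k, m) * p**(m + k) * (1 - p)**(n - m)
        for k in range(m, n + 1)
    )

def pmf_closed(n, p, m):
    """The simplified result: Y ~ Binomial(n, p^2)."""
    return comb(n, m) * (p**2)**m * (1 - p**2)**(n - m)

n, p = 7, 0.3
for m in range(n + 1):
    assert abs(pmf_direct(n, p, m) - pmf_closed(n, p, m)) < 1e-12

# the pmf sums to 1, as it must
assert abs(sum(pmf_closed(n, p, m) for m in range(n + 1)) - 1.0) < 1e-12
```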
                        $endgroup$
                          answered Jan 23 at 0:46









G Cab