What is the probability that the sent codeword will be eventually decoded correctly?


























I seem to be stuck on the following problem:



Let $C=\{000,111,222\}$ be a ternary code which is sent over a symmetric channel with symbol-error probability $p$.



We use an error detection system, so that if the received word is not a codeword in $C$, we request a retransmission.



The codeword $000\in C$ is sent. What is the probability that we eventually decode the codeword correctly, perhaps after several retransmissions?



Here is the progress I have made so far:



The probability that we decode correctly upon receiving the first transmission is the probability that no errors occur, i.e. $(1-p)^3$. The probability that we decode correctly upon receiving the second transmission, given that the first transmission was rejected, is $(1-p)^3(3p-3p^2+\frac{3}{4}p^3)$. The factor $(3p-3p^2+\frac{3}{4}p^3)$ comes from the fact that the probability that a word is rejected is one minus the probability that no errors occurred during the transmission plus the probability that $000$ was changed to $111$ or $222$ by errors introduced during the transmission, which is $(1-p)^3-\frac{1}{4}p^3=1-3p+3p^2-\frac{3}{4}p^3$. Likewise, the probability that the codeword is decoded correctly upon receiving the third transmission, given that the previous two were rejected, is $(1-p)^3(3p-3p^2+\frac{3}{4}p^3)^2$, and so on. My initial idea was simply to extrapolate this to the general case: the probability that the codeword is decoded correctly upon receiving the $(n+1)$-th transmission, given that the previous $n$ transmissions were rejected, is $(1-p)^3(3p-3p^2+\frac{3}{4}p^3)^n$. Summing from zero to infinity would, in my mind, yield

$$\sum_{n=0}^{\infty}(1-p)^3\left(3p-3p^2+\tfrac{3}{4}p^3\right)^n=(1-p)^3\sum_{n=0}^{\infty}\left(3p-3p^2+\tfrac{3}{4}p^3\right)^n=\frac{(1-p)^3}{(1-p)^3+\frac{1}{4}p^3}=\frac{1}{1+\frac{1}{4}\left(\frac{p}{1-p}\right)^3}.$$

Needless to say, I immediately recognised the fallaciousness of this argument, as the events are not independent.
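For reference, the geometric series above can be checked against its closed form numerically (a quick Python sketch added in editing, not part of the original post; `p = 0.1` is an arbitrary sample value):

```python
# Numeric sanity check: partial sums of the proposed series vs. the closed form.
p = 0.1
success = (1 - p) ** 3                 # decode correctly on a given transmission
reject = 3*p - 3*p**2 + 0.75 * p**3    # received word is not in C -> retransmit

# Partial sum of sum_{n>=0} success * reject**n (200 terms is far past convergence)
total = sum(success * reject**n for n in range(200))

closed_form = (1 - p) ** 3 / ((1 - p) ** 3 + 0.25 * p ** 3)
print(total, closed_form)
```

The two values agree to within floating-point precision, so the algebraic manipulation of the series itself is sound; the only question is whether summing the terms was legitimate in the first place.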



Now I find myself in a position in which I believe that the problem does admit a simple solution, which would be immediately recognised by a mind more lucid than mine. All help and input would, as always, be highly appreciated.










      probability coding-theory






      asked Jan 27 at 21:16









Heinrich Wagner























1 Answer





































The calculation is correct; there is no problem with independence.



Summing probabilities is correct if the events under consideration are disjoint: $P(X \cup Y) = P(X)+P(Y)$ if $X$ and $Y$ cannot happen at the same time. In your case, $X$ is the event that we end after $x$ retransmissions and $Y$ the event that we end after $y$ retransmissions, with $x \neq y$.



Independence comes into play only when we don't have disjointness; without it, the above formula becomes



$$P(X \cup Y) = P(X)+P(Y) - P(X \cap Y)$$



If we know that $X$ and $Y$ are independent, then we have $P(X \cap Y)=P(X)P(Y)$.
But this is not needed here: disjoint events are even simpler, as $P(X \cap Y)=0$ since $X$ and $Y$ cannot both happen.



---



I'll give another solution that avoids the infinite sum by a 'clever' argument:



In any given (re)transmission, there are three events relevant to the problem:



• $E_1$: We decode as "$000$",
• $E_2$: We decode as "$000$" or "$111$" or "$222$" (so a word from $C$),
• $E_3$: We decode as any other word (so a word not in $C$).



          The probability that event 1 happens is correctly calculated in the original post as



          $$p_1=P(E_1)=(1-p)^3$$



          The probability that event 2 happens is also correctly calculated (with a small clerical sign error) in the original post as



$$p_2=P(E_2)=(1-p)^3 + \frac18p^3 + \frac18p^3 = (1-p)^3 + \frac14p^3 = 1-3p+3p^2-\frac34p^3$$



          Events 2 and 3 are complementary, so we have



$$p_3=P(E_3)=1-p_2=3p-3p^2+\frac34p^3$$



          which also appears in the original post.
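The value of $p_2$ can also be confirmed by brute force (an illustrative Python sketch added in editing, not part of the original answer), enumerating all $27$ possible received words with exact rational arithmetic:

```python
from fractions import Fraction
from itertools import product

# Exhaustive check of p_2: when 000 is sent over the ternary symmetric channel,
# each symbol arrives correctly w.p. 1-p and as each wrong symbol w.p. p/2.
p = Fraction(1, 5)  # arbitrary exact sample value

def symbol_prob(sent, received):
    return 1 - p if sent == received else p / 2

# Sum the probabilities of the three codewords among all 27 received words.
p2 = sum(
    symbol_prob(0, a) * symbol_prob(0, b) * symbol_prob(0, c)
    for a, b, c in product(range(3), repeat=3)
    if (a, b, c) in {(0, 0, 0), (1, 1, 1), (2, 2, 2)}
)
assert p2 == (1 - p) ** 3 + Fraction(1, 4) * p ** 3
print(p2)
```

Because the arithmetic is exact, the assertion verifies the formula $(1-p)^3+\frac14 p^3$ with no rounding concerns for this sample $p$.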



In case of event 3, we ask for a retransmission, and everything that happened up to now becomes irrelevant: the retransmission will decide what happens (perhaps involving even more retransmissions).



So the received word is finally accepted when we get event 2 in some transmission; if that is also event 1, we decode correctly. Since the previous transmissions no longer matter (they only determined that we got to this transmission; they did not affect what happens in it), the probability $p_{000}$ that we decode correctly is the probability of event 1 conditioned on event 2:



$$p_{000}=P(E_1|E_2) = \frac{P(E_1 \cap E_2)}{P(E_2)}= \frac{P(E_1)}{P(E_2)}=\frac{p_1}{p_2}=\frac{(1-p)^3}{(1-p)^3 + \frac14p^3}$$
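This closed form is easy to confirm by simulation (a Monte Carlo sketch added in editing, not part of the original answer; `p = 0.3` is an arbitrary sample value):

```python
import random

def transmit(p, rng):
    """Send 000 once; each symbol flips to one of the two wrong values w.p. p."""
    return tuple(rng.choice([1, 2]) if rng.random() < p else 0 for _ in range(3))

def decode_eventually(p, rng):
    """Retransmit until the received word is a codeword; report correctness."""
    codewords = {(0, 0, 0), (1, 1, 1), (2, 2, 2)}
    while True:
        word = transmit(p, rng)
        if word in codewords:
            return word == (0, 0, 0)

p = 0.3
rng = random.Random(0)
trials = 100_000
estimate = sum(decode_eventually(p, rng) for _ in range(trials)) / trials
exact = (1 - p) ** 3 / ((1 - p) ** 3 + 0.25 * p ** 3)
print(round(estimate, 3), round(exact, 3))
```

With $100{,}000$ trials the empirical frequency matches the formula to roughly three decimal places, which is within the expected sampling error.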



















• You made it all very clear to me. I especially like your alternative and far more elegant solution. Thank you for your help. – Heinrich Wagner, Jan 28 at 14:16











          answered Jan 27 at 23:19









Ingix
