Optimal code for simple game


Setup: Alice and Bob are playing a cooperative game. Alice chooses a number $y \in \{1, 2, 3, 4\}$ uniformly at random. Bob doesn't observe $y$; his goal is to guess $y$. Alice can send Bob a message $z$ that contains at most 1 bit of information about $y$ (i.e., $I(z;y) = 1$).



Problem: How should Alice encode information about $y$ into her message $z$?





Potential Solutions: I have three ideas for what Alice should do, but they all give contradictory answers.




  1. If $y \in \{1, 2\}$, Alice sends $z = 0$; otherwise, Alice sends $z = 1$. The code $z$ contains 1 bit of information. Bob will guess $y$ correctly with probability 0.5.

  2. With probability 1/2, Alice sends $z = y$ (2 bits); otherwise, Alice sends some null message (0 bits). Thus, Alice sends 1 bit in expectation. Bob will guess $y$ correctly in the first case; in the second case, he will guess randomly and be correct with probability 0.25. In total, Bob will guess $y$ correctly with probability $0.5 \cdot 1.0 + 0.5 \cdot 0.25 = 0.625$.

  3. Alice samples $z$ from the 4-ary categorical distribution that places probability 0.811 on $z = y$ and probability 0.063 on each of the other 3 atoms. The marginal $p(z)$ is uniform, so $H(z) = \log_2(4) = 2$; the conditional $p(z \mid y)$ has entropy
    $$H(z \mid y) = 0.811 \cdot \log_2\!\left(\frac{1}{0.811}\right) + 3 \cdot 0.063 \cdot \log_2\!\left(\frac{1}{0.063}\right) \approx 1$$
    The information content of Alice's message is $I(z;y) = H(z) - H(z \mid y) = 1$. Bob's guess will be whatever message Alice sends, so he'll guess $y$ correctly with probability 0.811. (A quick numerical check of all three schemes follows the list.)
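For concreteness, here is a quick numerical check of the three schemes. The $p(z \mid y)$ matrices below are my own restatement of the items above, and Bob is assumed to use the maximum a posteriori (MAP) guess:

```python
import math

def mutual_information(p_z_given_y):
    """I(z;y) in bits for uniform y; rows are y values, columns are z values."""
    n_y = len(p_z_given_y)
    n_z = len(p_z_given_y[0])
    p_z = [sum(row[j] for row in p_z_given_y) / n_y for j in range(n_z)]
    h_z = -sum(q * math.log2(q) for q in p_z if q > 0)
    h_z_given_y = -sum(q * math.log2(q) for row in p_z_given_y for q in row if q > 0) / n_y
    return h_z - h_z_given_y

def success_probability(p_z_given_y):
    """P(Bob correct) when he guesses the most likely y given z (MAP)."""
    n_y = len(p_z_given_y)
    n_z = len(p_z_given_y[0])
    # P(correct) = sum over z of max_y p(y, z), with p(y, z) = p(z|y) / 4.
    return sum(max(p_z_given_y[i][j] for i in range(n_y)) for j in range(n_z)) / n_y

# Scheme 1: z = 0 if y in {1, 2}, else z = 1.
s1 = [[1, 0], [1, 0], [0, 1], [0, 1]]
# Scheme 2: z = y with prob 1/2, otherwise a fifth "null" symbol.
s2 = [[0.5 * (j == i) for j in range(4)] + [0.5] for i in range(4)]
# Scheme 3: z = y with prob 0.811, each other symbol with prob 0.063.
s3 = [[0.811 if j == i else 0.063 for j in range(4)] for i in range(4)]

for name, s in [("scheme 1", s1), ("scheme 2", s2), ("scheme 3", s3)]:
    print(name, round(mutual_information(s), 3), round(success_probability(s), 3))
# scheme 1: I = 1.0, success 0.5
# scheme 2: I = 1.0, success 0.625
# scheme 3: I ≈ 1.0 (0.811 and 0.063 are rounded), success 0.811
```

All three schemes carry exactly 1 bit, yet their success probabilities differ, which is what prompts the question.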

Tags: probability, information-theory, coding-theory

asked Jan 19 at 16:48 by Ben

1 Answer

The problem statement is a little neater for me if put in the following way:



Let $Y$ be uniform on $\{1, 2, 3, 4\}$. We will guess $Y$ from a variable $Z$, i.e. $\hat Y = g(Z)$, with $I(Y;Z) = 1$ bit. The goal is to minimize the probability of error $p_e = P(\hat Y \ne Y)$. We want to find the optimal joint distribution for $Y, Z$ (in terms of channels: find the optimal channel with $Y$ as input and $Z$ as output), together with the corresponding guess function $\hat Y = g(Z)$.



Your three answers are not "contradictory"; they are just different (valid) proposals that give different results. It would be contradictory to assume that they are all optimal; at most the third one can be.



To verify this, we recall Fano's inequality.



In our scenario $H(Y) = 2$, so $H(Y \mid Z) = H(Y) - I(Z;Y) = 1$, and Fano's inequality gives the bound
$$ 1 \le h(p_e) + p_e \log_2(3) \tag{1}$$
where $h(\cdot)$ is the binary entropy function. The critical value (which gives equality) is $p^*_e = 0.18929\ldots$, so the probability of correct decoding cannot be greater than $1 - p^*_e = 0.81071\ldots$
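As a numerical sanity check (a small sketch of mine, not part of the original argument), the critical value can be found by bisection, since $h(p) + p \log_2(3)$ is increasing on $[0, 3/4]$:

```python
# Solve 1 = h(p) + p*log2(3) for p by bisection; the left-hand side is
# increasing on [0, 3/4], so the root in that interval is unique.
import math

def fano_lhs(p):
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # binary entropy h(p)
    return h + p * math.log2(3)

lo, hi = 1e-9, 0.75
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if fano_lhs(mid) < 1 else (lo, mid)
print(lo)  # ≈ 0.18929, so correct decoding is at most 1 - p* ≈ 0.81071
```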



Your solution $3$ would correspond to a $4$-ary channel with "crossover" probability $p$, so that, say, $P(Z \mid Y = 1) = \left(1-p,\ \tfrac{p}{3},\ \tfrac{p}{3},\ \tfrac{p}{3}\right)$. Then, given that $Y$ is uniform, the conditional entropy would be



$$\begin{align}
H(Z \mid Y) &= \sum_i p(Y=i)\, H(Z \mid Y=i)\\
&= H(Z \mid Y=1)\\
&= -(1-p)\log_2(1-p) - 3\,\frac{p}{3}\log_2\!\left(\frac{p}{3}\right) \\
&= -(1-p)\log_2(1-p) - p \log_2(p) + p \log_2(3) \\
&= h(p) + p \log_2(3)
\end{align}$$



The value of $p$ for which $H(Z \mid Y) = 1$ coincides with the critical value of $(1)$ (here the marginal of $Z$ is also uniform, so $H(Y \mid Z) = H(Z \mid Y)$). And indeed, in this scheme the guess is $\hat Y = Z$, so $p$ is also the probability of decoding error. Hence this scheme attains the Fano bound, and it must be the optimal one.
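A quick Monte Carlo check of this optimal scheme (my own sketch, with labels $0$ to $3$ standing in for $1$ to $4$):

```python
# Monte Carlo check of the optimal scheme: Alice sends z = y with
# probability 1 - p*, otherwise a uniformly random wrong symbol,
# and Bob guesses y_hat = z.
import random

p_star = 0.18929          # critical error probability from the bisection above
trials = 10**6
hits = 0
for _ in range(trials):
    y = random.randrange(4)
    if random.random() < 1 - p_star:
        z = y
    else:
        z = random.choice([k for k in range(4) if k != y])
    hits += (z == y)
print(hits / trials)      # ≈ 0.811, matching 1 - p*
```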

answered Jan 19 at 20:02 by leonbloy, edited Jan 21 at 18:10