Is the Renyi entropy an invertible functional of ordered probability distributions?














The Renyi entropy assigns a real number $H_{\vec{p}}(\alpha)$ to a probability distribution $\vec{p} = (p_1, p_2, \ldots)$ as a function of an order parameter $\alpha > 0$, $\alpha \neq 1$:



$$H_{\vec{p}}(\alpha) \equiv \frac{1}{1-\alpha}\log\Bigg(\sum_i p_i^\alpha\Bigg).$$



When considered as a function of $\alpha$, this has been called the "spectrum of Renyi information". The Renyi entropies are notable for being non-decreasing under doubly stochastic operations ($\vec{p} \to T\vec{p}$ for a doubly stochastic matrix $T$) and for recovering the Shannon entropy as $\alpha \to 1$.
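
For concreteness, here is a minimal numerical sketch of the definition (the helper `renyi_entropy`, the example distribution, and the matrix `T` are illustrative assumptions, not part of the question): it evaluates the entropy at a few orders and spot-checks the doubly stochastic monotonicity.

```python
# Minimal sketch: H_p(alpha) = log(sum_i p_i**alpha) / (1 - alpha), with the
# alpha -> 1 limit handled as the Shannon entropy.
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    if alpha <= 0:
        raise ValueError("order must be positive")
    if np.isclose(alpha, 1.0):                 # alpha -> 1: Shannon entropy
        q = p[p > 0]
        return -np.sum(q * np.log(q))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.5, 0.3, 0.2])                  # example distribution (assumed)
for a in (0.5, 1.0, 2.0, 10.0):
    print(a, renyi_entropy(p, a))

# A doubly stochastic T (rows and columns sum to 1) makes the distribution
# more uniform, so the entropy should not decrease.
T = np.array([[0.7, 0.2, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.7]])
print(renyi_entropy(T @ p, 2.0) >= renyi_entropy(p, 2.0))  # True
```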



Let us ignore the order of the $p_i$, e.g., by focusing on the set of distributions where the probabilities are in non-increasing order. If we take the Renyi entropy to map such distributions to functions of $\alpha$,
$$ \vec{p} \to H_{\vec{p}}(\alpha),$$
is this map invertible? If so, does the inverse have a closed form?



In other words, must two discrete probability distributions be permutations of each other if they have the same Renyi entropy at all orders? Commenter Mike Earnest suspects so. And indeed, since large values of $\alpha$ "pick out" only the very largest probabilities, while smaller values are progressively more sensitive to probabilities further down the distribution, this seems likely to be true. It is essentially asking whether a probability distribution is uniquely determined by its $\alpha$-norm $\|\vec{p}\|_\alpha \equiv \left(\sum_i p_i^\alpha\right)^{1/\alpha}$ at all orders.
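
A quick numerical illustration of that intuition (the example distribution is an assumption): as $\alpha$ grows, the $\alpha$-norm converges to the largest probability, so high orders probe only the top of the distribution.

```python
# As alpha -> infinity, (sum_i p_i**alpha)**(1/alpha) -> max(p): large orders
# "see" only the largest probability.
import numpy as np

p = np.array([0.5, 0.3, 0.2])
for a in (2, 10, 100):
    print(a, np.sum(p ** a) ** (1 / a))        # approaches 0.5 = max(p)
```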



Bonus question: if we generalize to continuous probability distributions, will the same be true up to a set of measure zero?










      probability-distributions entropy






asked Dec 29 '18 at 18:14, edited Dec 29 '18 at 19:50 by Jess Riedel






















1 Answer



















          The answer to this question is "yes" in the case of discrete probabilities. I will gesture toward the solution with a physics-y argument but not give anything rigorous. Anyone who wants to contribute something better is strongly encouraged to do so.



Let $f(\alpha) \equiv \sum_i p_i^\alpha = \|\vec{p}\|_\alpha^\alpha = \exp\left[(1-\alpha)H_{\vec{p}}(\alpha)\right]$ for $\alpha > 0$, $\alpha \neq 1$. Analytically continuing to the complex plane, this looks like a Fourier transform of a "support" function on the log-probabilities: since $p_i^{it} = e^{it\ln p_i}$, we have $f(it) = \sum_i e^{it \ln p_i}$. We can apply the inverse Fourier transform to get that support function: $$\tilde{f}(\omega) \equiv \frac{1}{2\pi}\int_{-\infty}^\infty \! dt \, f(it)\, e^{-i\omega t} = \frac{1}{2\pi}\sum_i \int_{-\infty}^\infty \! dt \, e^{i(\ln p_i - \omega)t} = \sum_i \delta(\ln p_i - \omega),$$ where $\delta$ denotes the Dirac delta function (distribution).



Therefore, it ought to be possible to read off the probabilities, including the degree of any degeneracies, by computing $\tilde{f}(\omega)$ and finding the peaks: they sit at $\omega = \ln p_i$, so exponentiating the peak locations recovers the $p_i$. Given the nature of the Fourier transform and analytic continuation, it's presumably only necessary to know $f(\alpha)$ on a countable set.
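
A rough numerical sketch of that recovery (the truncated grid, the Gaussian window, and the example distribution are all assumptions made for illustration, not a rigorous procedure): sample $f(it)$, approximate the windowed inverse transform, and locate peaks near $\omega = \ln p_i$.

```python
# Sample f(it) = sum_i exp(i t ln p_i), approximate a windowed, truncated
# inverse Fourier transform, and read off peaks near omega = ln p_i;
# exponentiating the peak locations recovers the probabilities.
import numpy as np

p = np.array([0.5, 0.3, 0.2])                  # distribution to be recovered
t = np.linspace(-400, 400, 2 ** 15)
f_it = np.exp(1j * np.outer(t, np.log(p))).sum(axis=1)

# A Gaussian window tames the delta spikes into finite, well-separated peaks.
window = np.exp(-(t / 150.0) ** 2)
dt = t[1] - t[0]
omegas = np.linspace(-2.5, 0.0, 600)           # search window for ln p_i
spectrum = np.array([np.abs(np.sum(f_it * window * np.exp(-1j * w * t)) * dt)
                     for w in omegas])

# The three tallest local maxima sit near ln(0.2), ln(0.3), ln(0.5).
interior = (spectrum[1:-1] > spectrum[:-2]) & (spectrum[1:-1] > spectrum[2:])
locs, heights = omegas[1:-1][interior], spectrum[1:-1][interior]
top = locs[np.argsort(heights)[-3:]]
print(np.sort(np.exp(top)))                    # approx [0.2, 0.3, 0.5]
```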






answered Jan 10 at 15:38 by Jess Riedel













Source: I heard this from Dan Ranard. – Jess Riedel, Jan 10 at 15:39










