A simple inequality involving the entropy function $H(x)$.
I came across the following simple inequality while working on a proof. I would like to know whether it is a known inequality and whether it is used anywhere.
$H(x) > (\ln x)\times(\ln (1-x)), \quad 0 < x < 1.$
The proof goes as follows:
\begin{eqnarray}
H(x) &=& -x\ln x - (1-x)\times\ln(1-x)\\
&=& -x\ln x + (x-1)\times\ln(1-x),
\end{eqnarray}
and since $-x\ln x > 0$ for $0 < x < 1$, and $x-1 \ge \ln x$ for all $x > 0$,
\begin{eqnarray}
H(x) &\ge& (x-1)\times\ln(1-x)\\
&\ge& (\ln x)\times(\ln(1-x)).
\end{eqnarray}
Any comments would be welcome.
Tags: inequality, entropy
asked Dec 24 '18 at 20:54 – AYO
The inequality does not hold (simple to check). However, plotting $\ln(x)\ln(1-x)$ indicates that it is actually a good approximation (not a bound) of the entropy function.
– Stelios, Dec 24 '18 at 21:40
Could you please elaborate on what you mean by a simple check? Thanks.
– AYO, Dec 24 '18 at 23:14
$-2 > -3$ does not imply $-2\times -5 > -3\times -5$. Similarly here, since $\ln(1-x) < 0$, multiplying by $\ln x$ does not give a smaller number than multiplying by $x-1$ (both being negative).
– Macavity, Dec 25 '18 at 3:08
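In symbols (a restatement of this comment, added for clarity): for $0 < x < 1$ we have $\ln x \le x-1 < 0$ and $\ln(1-x) < 0$, so multiplying $\ln x \le x-1$ by the negative number $\ln(1-x)$ reverses the inequality,
$$(x-1)\ln(1-x) \le (\ln x)\ln(1-x),$$
which is the opposite of the direction the last step of the proof requires.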
That makes sense, thanks. I've fixed the proof and will post it after I go over it.
– AYO, Dec 26 '18 at 10:16
1 Answer
It turns out that this inequality was proved in the following article:
M. Bahramgiri and O. Naghshineh Arjomand, "A simple proof of the entropy inequality," RGMIA Research Report Collection, 3(4), 2000.
The same article also provides an upper bound. The two-sided inequality proved there is
$$(\ln x)(\ln (1-x)) \le H(x) \le \frac{(\ln x)(\ln (1-x))}{\ln 2}, \qquad 0 < x < 1.$$
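As a quick numerical sanity check of both bounds (a minimal sketch added here, not part of the article; it assumes Python with numpy):

```python
import numpy as np

# Binary entropy in nats: H(x) = -x ln x - (1 - x) ln(1 - x).
x = np.linspace(1e-6, 1 - 1e-6, 100_001)
H = -x * np.log(x) - (1 - x) * np.log(1 - x)

lower = np.log(x) * np.log(1 - x)   # (ln x)(ln(1-x))
upper = lower / np.log(2)           # (ln x)(ln(1-x)) / ln 2

# Both maxima should be <= 0, up to floating-point rounding near
# x = 1/2, where the upper bound is attained with equality.
print("worst lower-bound gap:", np.max(lower - H))
print("worst upper-bound gap:", np.max(H - upper))
```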
A more elementary proof of the left inequality goes as follows. Writing out the entropy function and dividing both sides of the desired inequality by $(\ln x)(\ln (1-x))$ (which is positive) gives
$$1 \le \frac{-x}{\ln (1-x)} + \frac{-(1-x)}{\ln x}, \qquad 0 < x < 1. \qquad (1)$$
Using the logarithmic inequality $\ln y > (y-1)/\sqrt{y}$ for $0 < y < 1$, with $y = x$ and $y = 1-x$, yields
$$\sqrt{x} + \sqrt{1-x} < \frac{-x}{\ln (1-x)} + \frac{-(1-x)}{\ln x}. \qquad (2)$$
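To spell out this step (an intermediate computation added here, not in the original answer): both sides of $\ln y > (y-1)/\sqrt{y}$ are negative for $0 < y < 1$, so taking reciprocals and then multiplying by the negative quantity $y-1$ reverses the inequality twice, leaving
$$\frac{-(1-x)}{\ln x} > \sqrt{x} \quad\text{and}\quad \frac{-x}{\ln (1-x)} > \sqrt{1-x};$$
adding the two gives $(2)$.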
Now it is not difficult to verify that $1 \le \sqrt{x} + \sqrt{1-x}$ for $0 < x < 1$; together with $(2)$ this implies $(1)$, which in turn implies the desired entropy inequality.
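For completeness, that verification is a single squaring (added here):
$$\left(\sqrt{x} + \sqrt{1-x}\right)^2 = 1 + 2\sqrt{x(1-x)} \ge 1, \qquad 0 < x < 1,$$
and since $\sqrt{x} + \sqrt{1-x} > 0$, taking square roots gives the claim.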
answered Jan 9 at 12:57, edited Jan 23 at 11:38 – AYO