Negative Entropy $x\log x$ is convex


























Can someone tell me (or show me) where I can find a proof that \[f:\mathbb{R}_{\geq0}\to\mathbb{R},\quad x\mapsto x\log x\] is convex, without using the first or second derivative trick?



We call a map convex if \[f(\lambda x+(1-\lambda)\tilde{x})\leq\lambda f(x)+(1-\lambda)f(\tilde{x})\quad\text{for}~\lambda\in(0,1).\]
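As a numerical sanity check (not a proof), one can sample the defining inequality directly for $f(x)=x\log x$; the function and tolerance below are illustrative choices, not part of the question:

```python
import math
import random

def f(x):
    """Negative entropy f(x) = x*log(x), defined for x > 0."""
    return x * math.log(x)

# Sample random x, y > 0 and lambda in (0, 1) and check the
# convexity inequality f(l*x + (1-l)*y) <= l*f(x) + (1-l)*f(y).
random.seed(0)
for _ in range(10_000):
    x, y = random.uniform(1e-6, 100), random.uniform(1e-6, 100)
    lam = random.uniform(0, 1)
    lhs = f(lam * x + (1 - lam) * y)
    rhs = lam * f(x) + (1 - lam) * f(y)
    assert lhs <= rhs + 1e-9, (x, y, lam)
print("convexity inequality held on all samples")
```

Of course this only gathers evidence; the question asks for an actual proof from the definition.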

































  • I know that the function is convex, and I also know that the reason is that the second derivative is positive for all $x>0$. What I am searching for is a proof without the second derivative argument.
    – FuncAna09
    Jan 26 at 15:36










  • Welcome @FuncAna09! Could you write the definition of convex function you know? It would be easier to help you. :)
    – Ixion
    Jan 26 at 15:38










  • Why are you searching for a proof that avoids the second derivative? In this case, it's by far the cleanest argument.
    – Misha Lavrov
    Jan 26 at 15:51










  • Simply because I want to know how the proof would go without using derivatives.
    – FuncAna09
    Jan 26 at 15:58










  • I think it might be helpful to consider a slight perturbation of the subtitle of Stanley Kubrick's movie "Doctor Strangelove": "How I learned to stop worrying and love the second derivative trick".
    – Lee Mosher
    Jan 26 at 16:52


















real-analysis







edited Jan 26 at 22:24









Ethan Bolker

asked Jan 26 at 15:29









FuncAna09

3 Answers

Answer to previous version of this question:



This is hair-splitting, but: the first derivative of your function is an increasing function, so your function is convex.



Hair-splitting, because the first derivative trick is not the second derivative trick, and "increasing first derivative" is not exactly "non-negative second derivative", but it still implies convexity.
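A small numerical illustration (again, not a proof) of this answer's point: for $f(x)=x\log x$ the first derivative is $f'(x)=\log x+1$, which is increasing, and an increasing derivative forces chords to lie above the graph. The grid below is an arbitrary illustrative choice:

```python
import math

def fprime(x):
    """First derivative of x*log(x): f'(x) = log(x) + 1."""
    return math.log(x) + 1

# Check monotonicity of f' on a grid in (0, 20).
xs = [0.1 * k for k in range(1, 200)]
assert all(fprime(a) < fprime(b) for a, b in zip(xs, xs[1:]))
print("f' is increasing on the sampled grid")
```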






edited Jan 26 at 16:44
answered Jan 26 at 16:29
kimchi lover

  • Okay, that was my mistake. The question is as follows: Is there any proof that only uses the definition?
    – FuncAna09
    Jan 26 at 16:38






















So I tried the proof. To prove that a function is convex, we take two points $x$ and $y$ and try to prove that
$$f(\lambda x + (1-\lambda) y) \leq \lambda f(x) + (1-\lambda) f(y).$$
The concept is simple: if we join the two points $x$ and $y$ by a line, then all points of the graph between them should lie below that line. The points between $x$ and $y$ are parametrized as $\lambda x + (1-\lambda) y$: as $\lambda$ varies over $(0,1)$, at $\lambda = 1$ we get the point $x$, at $\lambda = 0$ we get the point $y$, and at $\lambda = 0.5$ we get the midpoint of $x$ and $y$.



Now for the proof:
$$f(x) = x \log x,$$
so
$$f(\lambda x + (1-\lambda) y) = (\lambda x + (1-\lambda) y)\,\log (\lambda x +(1-\lambda)y).$$



The RHS becomes
$$\lambda x \log (\lambda x + (1-\lambda) y) + (1-\lambda) y \log (\lambda x + (1-\lambda) y).$$



Take the first part, and you will notice $\lambda x \log(\lambda x + (1-\lambda)y) \geq \lambda x \log(\lambda x)$, since $\log(A+B) \geq \log A$ for $A, B > 0$.



Again,
$$\lambda x \log (\lambda x) = \lambda x (\log \lambda + \log x) \geq \lambda x \log x.$$
So this leads to
$$\lambda x \log(\lambda x + (1-\lambda)y) \geq \lambda x \log x.$$
Similarly,
$$(1-\lambda) y \log(\lambda x + (1-\lambda)y) \geq (1-\lambda) y \log y.$$



Adding these two inequalities, you get
$$\lambda x \log(\lambda x + (1-\lambda)y) + (1-\lambda) y \log(\lambda x + (1-\lambda)y) \geq \lambda x \log x + (1-\lambda) y \log y.$$



This proves
$$f(\lambda x + (1-\lambda) y) \geq \lambda f(x) + (1-\lambda) f(y),$$
meaning that $f(x) = x\log x$ is not convex, and hence that $-f(x)=-x\log x$ is convex.



Hope this helps.






answered Jan 26 at 17:35
SNEHIL SANYAL
  • This cannot be, because the second derivative is $f''(x)=\frac{1}{x}>0~\forall x>0$.
    – FuncAna09
    Jan 26 at 18:14








  • $\log(\lambda x)<\log(x)$ because $0<\lambda<1$. This was your mistake.
    – FuncAna09
    Jan 26 at 18:28
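The gap that the comment points out can be checked numerically: for $0<\lambda<1$ we have $\log(\lambda x)=\log\lambda+\log x<\log x$ since $\log\lambda<0$, so the step "$\lambda x\log(\lambda x)\geq\lambda x\log x$" fails. The sample values below are arbitrary:

```python
import math

# For 0 < lam < 1: log(lam*x) = log(lam) + log(x) < log(x),
# because log(lam) < 0. Multiplying by lam*x > 0 preserves the
# direction, so lam*x*log(lam*x) < lam*x*log(x), not >=.
x, lam = 3.0, 0.25
assert math.log(lam) < 0
assert math.log(lam * x) < math.log(x)
assert lam * x * math.log(lam * x) < lam * x * math.log(x)
print("log(lam*x) < log(x) for lam in (0,1), as the comment says")
```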




















That's what I figured out: Let $\lambda\in (0,1)$ and $x,y\in\mathbb{R}_{>0}$. Then we get
\begin{align*}
f(\lambda x+(1-\lambda)y)&=[\lambda x+(1-\lambda)y]\log(\lambda x+(1-\lambda)y)\\
&=\lambda x\log(\lambda x+(1-\lambda)y)+(1-\lambda)y\log(\lambda x+(1-\lambda)y).
\end{align*}
For the first term of the RHS we get
\[\lambda x\log(\lambda x+(1-\lambda)y)\leq\lambda x\log(x)+\lambda(1-\lambda)y\]
and for the second term we get
\[(1-\lambda)y\log(\lambda x+(1-\lambda)y)\leq (1-\lambda)y\log(y)+\lambda(1-\lambda)x.\]
In summary, this results in
\[f(\lambda x+(1-\lambda)y)\leq \lambda f(x)+(1-\lambda)f(y)+\lambda(1-\lambda)(x+y).\]
The last term is the problem now.
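The displayed estimate can at least be sanity-checked numerically; since it is weaker than convexity itself, it should hold on all samples. The sampling ranges below are illustrative choices:

```python
import math
import random

def f(x):
    """Negative entropy f(x) = x*log(x) for x > 0."""
    return x * math.log(x)

# Check the derived (weaker) bound:
#   f(l*x + (1-l)*y) <= l*f(x) + (1-l)*f(y) + l*(1-l)*(x+y)
random.seed(1)
for _ in range(10_000):
    x, y = random.uniform(1e-3, 50), random.uniform(1e-3, 50)
    lam = random.uniform(0, 1)
    lhs = f(lam * x + (1 - lam) * y)
    rhs = lam * f(x) + (1 - lam) * f(y) + lam * (1 - lam) * (x + y)
    assert lhs <= rhs + 1e-9, (x, y, lam)
print("the bound with the extra lam*(1-lam)*(x+y) term held on all samples")
```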






edited Jan 26 at 22:21
answered Jan 26 at 19:00
FuncAna09












