Logistic Regression is convex proof

I am trying to make sense of this paper: qwone.com/~jason/writing/convexLR.pdf, "Regularized Logistic Regression is Strictly Convex" by Jason D. M. Rennie.

I am following the proof; formula (1) is given:



$$
-\ln P(\vec{y}\mid X,\vec{w}) = \sum_{i=1}^N \ln\left(1+e^{-y_i\vec{w}^T\vec{x}_i}\right)
$$



Assuming:



$$
g(z) = \frac{1}{1+e^{-z}}
$$



I also see how



$$
1-g(z) = \frac{e^{-z}}{1+e^{-z}}
$$



However, I don't follow how



$$
\frac{\partial g(z)}{\partial z} = -g(z)(1-g(z))
$$



If I differentiate $g(z)$ w.r.t. $z$, I get:



$$
\frac{\partial g(z)}{\partial z} = \frac{e^{-z}}{(1+e^{-z})^2}
$$



which is $g(z)(1-g(z))$, not $-g(z)(1-g(z))$.
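
Spelling out that computation (it is just the chain rule applied to $(1+e^{-z})^{-1}$):

$$
\frac{\partial g(z)}{\partial z}
= -(1+e^{-z})^{-2}\cdot(-e^{-z})
= \frac{1}{1+e^{-z}}\cdot\frac{e^{-z}}{1+e^{-z}}
= g(z)\,(1-g(z)) > 0.
$$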



Also, when working through (2) I get the negative of what is written there (taking into account that it takes the partial derivative of the negative of the L.H.S. of (1)):

$$
\frac{\partial\,(-\,\text{L.H.S. of (1)})}{\partial w_j}
$$
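
Concretely, differentiating one term of (1) with respect to $w_j$ gives

$$
\frac{\partial}{\partial w_j}\ln\!\left(1+e^{-y_i\vec{w}^T\vec{x}_i}\right)
= \frac{-y_i x_{ij}\,e^{-y_i\vec{w}^T\vec{x}_i}}{1+e^{-y_i\vec{w}^T\vec{x}_i}}
= -\bigl(1-g(y_i\vec{w}^T\vec{x}_i)\bigr)\,y_i x_{ij},
$$

so each term of the gradient of (1) carries an overall minus sign.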



Thanks in advance!

convex-optimization

asked Feb 21 '15 at 17:14 by user1064285, edited Feb 21 '15 at 17:16 by Michael Hardy






















  • Your calculations are correct; the error is in the paper you were reading.
    – Math-fun
    Feb 21 '15 at 17:22




  • Can someone else check? I have seen other successful proofs that logistic regression optimization is a convex problem; I'm just wondering if there is anything I'm not seeing...
    – user1064285
    Feb 22 '15 at 14:29
1 Answer

Here is the graph of $\displaystyle g(z)=\frac{1}{1+e^{-z}}$:



[graph of the sigmoid $g(z)$, strictly increasing from 0 to 1]



which is clearly increasing, so the derivative must be positive; as your calculations show, $g'(z)=g(z)(1-g(z))>0$. This is likely a typo in the paper you mentioned. Equation (2) in the paper, however, is correct, since it is the derivative of $\log P(\cdot)$, not of $-\log P(\cdot)$. The second derivative in the paper is also correct.
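
As a quick numerical sanity check (my own sketch, not part of the original answer; it assumes only numpy), one can compare a central finite-difference derivative of $g$ against $g(z)(1-g(z))$:

    import numpy as np

    # Logistic sigmoid, as defined in the question.
    def g(z):
        return 1.0 / (1.0 + np.exp(-z))

    z = np.linspace(-10.0, 10.0, 1001)
    h = 1e-6

    # Central finite-difference approximation of g'(z).
    numeric = (g(z + h) - g(z - h)) / (2.0 * h)

    # Closed form with the *positive* sign: g'(z) = g(z) * (1 - g(z)).
    closed = g(z) * (1.0 - g(z))

    assert np.allclose(numeric, closed, atol=1e-8)  # positive sign matches
    assert np.all(closed > 0)                       # g is strictly increasing
    print("g'(z) = g(z)(1 - g(z)) > 0 confirmed numerically")

On the convexity point itself: with labels $y_i\in\{-1,+1\}$ we have $y_i^2=1$, so the second derivative of each term of (1) is $g(1-g)\,x_{ij}x_{ik}$, making the Hessian a positive semidefinite sum of rank-one matrices; an $\ell_2$ regularizer adds a positive multiple of the identity to this Hessian, which is what makes the regularized loss strictly convex.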






answered Feb 22 '15 at 15:45 by Math-fun