Convex Optimization - Minimizing the Frobenius Norm of a Matrix with Linear Inequality Constraint of its...












I have the following optimization problem:



Minimize $\|X\|_F$ subject to $Ax \le b$



where $X \in \mathbb{R}^{n\times n}$ is the matrix of variables, $x \in \mathbb{R}^{n^2}$ is the vector of those variables, $A\in \mathbb{R}^{n^2\times n^2}$, $b\in\mathbb{R}^{n^2}$, and $\|\cdot\|_F$ is the Frobenius norm, $\|X\|_F = \sqrt{\operatorname{tr}(XX^T)}$.



If I understand correctly, this is a convex optimization problem and can be solved with convex optimization tools, but my search did not find an explicit treatment of the subject and I believe I'm missing obvious tricks.



What is the standard way of solving this problem using convex optimization? Are there any transformations that make it, e.g., a semidefinite programming problem? A quadratic programming problem? Etc.










  • There is no constraint that $X$ be positive semidefinite, right? The Frobenius norm of $X$ is simply the two-norm of $x$.
    – Brian Borchers
    May 21 '18 at 15:35












  • How large is $n$? How much memory does $A$ take up?
    – littleO
    May 21 '18 at 16:39
















linear-algebra optimization convex-optimization least-squares






edited May 22 '18 at 8:48 by Royi

asked May 21 '18 at 12:10 by Gadi A












2 Answers

Minimizing $ {\left\| X \right\|}_{F} $ is equivalent to minimizing $ {\left\| X \right\|}_{F}^{2} $, which in turn is equivalent to minimizing $ {\left\| x \right\|}_{2}^{2} $, where $ x = \operatorname{vec}\left( X \right) $, i.e., the vectorization operator applied to $ X $.



Now you can write your problem as:



$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} {\left\| C x - d \right\|}_{2}^{2} \\
\text{subject to} \quad & A x \leq b
\end{align*}$$



where $ C = I $ and $ d = \boldsymbol{0} $.



Now all you need is a linear least squares solver that supports linear inequality constraints.
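
For concreteness, a minimal sketch of this constrained least-squares formulation using CVXPY might look like the following (the data $A$, $b$ and the size $n$ below are illustrative placeholders, not values from the question):

    import numpy as np
    import cvxpy as cp

    n = 4                                   # illustrative problem size
    A = np.random.randn(n**2, n**2)         # placeholder constraint matrix
    b = np.random.randn(n**2)               # placeholder right-hand side

    x = cp.Variable(n**2)                   # x = vec(X)
    # (1/2)||C x - d||_2^2 with C = I, d = 0 reduces to (1/2)||x||_2^2
    objective = cp.Minimize(0.5 * cp.sum_squares(x))
    constraints = [A @ x <= b]
    prob = cp.Problem(objective, constraints)
    prob.solve()

    X = np.reshape(x.value, (n, n), order="F")  # undo the vectorization

Since the objective is quadratic and the constraints are linear, any QP solver can handle this form as well.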






answered May 21 '18 at 16:35 by Royi, edited Jan 9 at 5:11 by Alec Jacobson

  • That's not L1 minimization, Alec.
    – Dirk
    Jan 9 at 5:28



















Since $\|X\|_F=\|x\|_2$, this is most naturally formulated as a second-order cone program (SOCP).
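
A minimal sketch of that SOCP formulation, again using CVXPY with placeholder data ($A$, $b$, $n$ here are illustrative assumptions): introduce an epigraph variable $t$, minimize $t$, and impose the second-order cone constraint $\|x\|_2 \le t$ together with $Ax \le b$.

    import numpy as np
    import cvxpy as cp

    n = 4                                   # illustrative problem size
    A = np.random.randn(n**2, n**2)         # placeholder constraint matrix
    b = np.random.randn(n**2)               # placeholder right-hand side

    x = cp.Variable(n**2)                   # x = vec(X)
    t = cp.Variable()                       # epigraph variable bounding ||x||_2

    prob = cp.Problem(
        cp.Minimize(t),
        [cp.norm(x, 2) <= t,                # second-order cone constraint
         A @ x <= b],                       # linear inequality constraints
    )
    prob.solve()

Solvers that handle second-order cone constraints (for example ECOS or MOSEK) accept this form directly.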






answered May 21 '18 at 17:40 by Michal Adamaszek