Convex Optimization - Minimizing the Frobenius Norm of a Matrix with Linear Inequality Constraint of its...
I have the following optimization problem:
Minimize $\|X\|_F$ subject to $Ax \le b$,
where $X \in \mathbb{R}^{n \times n}$ is a matrix of variables, $x \in \mathbb{R}^{n^2}$ is the vector of those variables, $A \in \mathbb{R}^{n^2 \times n^2}$, $b \in \mathbb{R}^{n^2}$, and $\|\cdot\|_F$ is the Frobenius norm $\|X\|_F = \sqrt{\operatorname{tr}(X X^T)}$.
If I understand correctly, this is a convex optimization problem and can be solved with convex optimization tools, but my search did not turn up an explicit treatment of the subject, and I believe I'm missing obvious tricks.
What is the standard way of solving this problem using convex optimization? Are there transformations which make it, e.g., a semidefinite program? A quadratic program? etc.
linear-algebra optimization convex-optimization least-squares
There is no constraint that $X$ be positive semidefinite, right? The Frobenius norm of $X$ is simply the 2-norm of $x$.
– Brian Borchers, May 21 '18 at 15:35
How large is $n$? How much memory does $A$ take up?
– littleO, May 21 '18 at 16:39
asked May 21 '18 at 12:10 by Gadi A; edited May 22 '18 at 8:48 by Royi
2 Answers
Minimizing $\|X\|_F$ is equivalent to minimizing $\|X\|_F^2$, which is in turn equivalent to minimizing $\|x\|_2^2$, where $x = \operatorname{vec}(X)$, i.e. the vectorization operator applied to $X$.
Now you can write your problem as:
$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} \left\| C x - d \right\|_2^2 \\
\text{subject to} \quad & A x \leq b
\end{align*}$$
where $C = I$ and $d = \boldsymbol{0}$.
Now all you need is a linear least squares solver which supports linear inequality constraints.
That's not L1 minimization, Alec.
– Dirk, Jan 9 at 5:28
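Not part of the original answer, but the least-squares formulation above can be sketched with SciPy's SLSQP solver, which accepts general linear inequality constraints. The problem data here ($A$, $b$, $n$) are random placeholders standing in for the application's actual data.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical problem data; in practice A and b come from your application.
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n**2, n**2))
b = rng.standard_normal(n**2)

# Minimize (1/2)||x||_2^2 subject to A x <= b,
# i.e. the C = I, d = 0 case of the least-squares form above.
res = minimize(
    fun=lambda x: 0.5 * x @ x,          # objective (1/2)||x||_2^2
    x0=np.zeros(n**2),
    jac=lambda x: x,                    # its gradient
    constraints=[{"type": "ineq",       # SLSQP convention: g(x) >= 0
                  "fun": lambda x: b - A @ x,
                  "jac": lambda x: -A}],
    method="SLSQP",
)
X_opt = res.x.reshape(n, n)             # fold the vector back into a matrix
print(res.success, np.linalg.norm(X_opt, "fro"))
```

For large $n$ a general-purpose SQP solver will be slow; a dedicated QP solver exploiting the identity Hessian is the better choice.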
Since $\|X\|_F = \|x\|_2$, this is most naturally formulated as a second-order cone problem (SOCP).
(First answer: answered May 21 '18 at 16:35 by Royi; edited Jan 9 at 5:11 by Alec Jacobson)
(Second answer: answered May 21 '18 at 17:40 by Michal Adamaszek)
Thanks for contributing an answer to Mathematics Stack Exchange!