Equivalent convex programs with different solutions
Let $R_\kappa \in \mathbb{R}^{d \times d}_{sym}$, $S_\kappa \in \mathbb{R}^{d \times d}_{sym}$, and $\eta_\kappa \in \mathbb{R}^+$ for a set of indices $\{ \kappa \}$. Define an optimization problem $(1)$ as
\begin{align}
&\min_{\{S_\kappa\}} \sum_\kappa \eta_\kappa \exp\!\big( \operatorname{tr}(R_\kappa S_\kappa) \big)\\
\text{s.t.}\quad& 2 \Big(\sum_\kappa \exp\!\big(\operatorname{tr}(S_\kappa)/2\big) - C\Big) \leq 0\\
&\frac{1}{2}\big(\| S_\kappa \|^2_{Fr} - \alpha^2 \big) \leq 0, \quad \forall \kappa
\end{align}
which is convex and, provided there is a feasible interior point, has an optimal solution.
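(For completeness, the convexity check I have in mind is just composition rules; this is my own bookkeeping:)
\begin{align}
S_\kappa \mapsto \operatorname{tr}(R_\kappa S_\kappa) \ \text{linear} &\;\Rightarrow\; S_\kappa \mapsto \exp\!\big(\operatorname{tr}(R_\kappa S_\kappa)\big) \ \text{convex},\\
S_\kappa \mapsto \operatorname{tr}(S_\kappa) \ \text{linear} &\;\Rightarrow\; S_\kappa \mapsto \exp\!\big(\operatorname{tr}(S_\kappa)/2\big) \ \text{convex},\\
\|\cdot\|_{Fr}^2 \ \text{convex} &\;\Rightarrow\; S_\kappa \mapsto \tfrac{1}{2}\big(\| S_\kappa \|^2_{Fr} - \alpha^2\big) \ \text{convex},
\end{align}
and nonnegative weighted sums of convex functions are convex, so the objective and both constraints of $(1)$ are convex.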
If I now take the function
$$ \phi(x) = \begin{cases} x^2, & x>0\\ 0, & x \leq 0 \end{cases}$$
and apply it to the second constraint, I get optimization problem $(2)$:
\begin{align}
&\min_{\{S_\kappa\}} \sum_\kappa \eta_\kappa \exp\!\big( \operatorname{tr}(R_\kappa S_\kappa) \big)\\
\text{s.t.}\quad& 2 \Big(\sum_\kappa \exp\!\big(\operatorname{tr}(S_\kappa)/2\big) - C\Big) \leq 0\\
&\frac{1}{4}\phi\big(\| S_\kappa \|^2_{Fr} - \alpha^2\big) \leq 0, \quad \forall \kappa
\end{align}
$\phi$ is a convex function and the feasible sets of the two problems are identical, so $(1)$ and $(2)$ should have the same solution.
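(To spell out the equivalence of the feasible sets, and the derivative of $\phi$ that I use below:)
\begin{align}
\phi(x) \geq 0 \ \ \forall x, \qquad \phi(x) \leq 0 \iff x \leq 0, \qquad \phi'(x) = \begin{cases} 2x, & x > 0\\ 0, & x \leq 0 \end{cases}
\end{align}
so $\tfrac{1}{4}\phi(\| S_\kappa \|^2_{Fr} - \alpha^2) \leq 0$ holds exactly when $\| S_\kappa \|^2_{Fr} \leq \alpha^2$, the same set as in $(1)$.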
The first KKT condition (stationarity of the Lagrangian) for $(1)$ gives
\begin{align}
0 &= \eta_\kappa R_\kappa \exp\!\big(\operatorname{tr}(R_\kappa S_\kappa)\big) + \lambda \mathcal{I} + \mu_\kappa S_\kappa
\end{align}
and applying the trace decomposition $A = a\mathcal{I} + \tilde{A}$, where $\operatorname{tr}(\tilde{A}) = 0$, gives
\begin{align}
0 &= \eta_\kappa \tilde{R}_\kappa \exp\!\big(\operatorname{tr}(R_\kappa S_\kappa)\big) + \mu_\kappa \tilde{S}_\kappa
\end{align}
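(The matrix-calculus identities I am using in these stationarity conditions, under the convention that the gradient with respect to the symmetric matrix $S_\kappa$ is again a symmetric matrix, are:)
\begin{align}
\frac{\partial}{\partial S_\kappa}\operatorname{tr}(R_\kappa S_\kappa) &= R_\kappa, \qquad
\frac{\partial}{\partial S_\kappa}\exp\!\big(\operatorname{tr}(S_\kappa)/2\big) = \tfrac{1}{2}\exp\!\big(\operatorname{tr}(S_\kappa)/2\big)\,\mathcal{I},\\
\frac{\partial}{\partial S_\kappa}\tfrac{1}{2}\| S_\kappa \|^2_{Fr} &= S_\kappa, \qquad
\frac{\partial}{\partial S_\kappa}\tfrac{1}{4}\phi\big(\| S_\kappa \|^2_{Fr} - \alpha^2\big) = \tfrac{1}{4}\phi'\big(\| S_\kappa \|^2_{Fr} - \alpha^2\big)\, 2 S_\kappa,
\end{align}
so the first constraint only ever contributes a nonnegative multiple of $\mathcal{I}$, which is why it drops out of the traceless part above.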
The Lagrangian of $(2)$ is
\begin{align}
\sum_\kappa \eta_\kappa \exp\!\big(\operatorname{tr}(R_\kappa S_\kappa)\big) + 2\lambda \Big(\sum_\kappa \exp\!\big(\operatorname{tr}(S_\kappa)/2\big) - C\Big) + \sum_\kappa \frac{\mu_\kappa}{4}\phi\big(\| S_\kappa \|^2_{Fr} - \alpha^2\big)
\end{align}
Differentiating with respect to $S_\kappa$ and setting the result equal to $0$ gives
\begin{align}
0 &= \eta_\kappa R_\kappa \exp\!\big(\operatorname{tr}(R_\kappa S_\kappa)\big) + \lambda \mathcal{I} + \mu_\kappa S_\kappa \begin{cases} \| S_\kappa \|^2_{Fr} - \alpha^2, & \| S_\kappa \|^2_{Fr} - \alpha^2 > 0\\
0, & \| S_\kappa \|^2_{Fr} - \alpha^2 \leq 0
\end{cases}
\end{align}
and applying the trace decomposition again gives
\begin{align}
0 &= \eta_\kappa \tilde{R}_\kappa \exp\!\big(\operatorname{tr}(R_\kappa S_\kappa)\big) + \mu_\kappa \tilde{S}_\kappa \begin{cases} \| S_\kappa \|^2_{Fr} - \alpha^2, & \| S_\kappa \|^2_{Fr} - \alpha^2 > 0\\
0, & \| S_\kappa \|^2_{Fr} - \alpha^2 \leq 0
\end{cases}
\end{align}
which gives a contradiction: primal feasibility puts us on the $0$ branch, so the equation forces $\tilde{R}_\kappa = 0$, yet we know $\tilde{R}_\kappa \neq 0$ in general. What have I done wrong here?
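Here is a minimal numerical sketch of what I am seeing, for a toy instance I made up with $d = 1$ and a single $\kappa$ (so $R_\kappa$, $S_\kappa$ are scalars $r$, $s$; the values of $r$, $\eta$, $\alpha$, $C$ are arbitrary choices, and $s = \alpha$ is the shared minimizer because $r < 0$ makes the objective decreasing in $s$ while only the norm constraint binds there):

```python
import numpy as np

# Toy instance: d = 1, one kappa, so S and R are scalars.
# All values below are my own arbitrary choices.
r, eta, alpha, C = -1.0, 1.0, 0.5, 10.0

# With r < 0 the objective eta*exp(r*s) is decreasing in s, and since
# exp(alpha/2) < C only the norm constraint is active at the optimum,
# so the common minimizer of (1) and (2) is s = alpha (and lambda = 0).
s = alpha

grad_obj = eta * r * np.exp(r * s)        # d/ds of the objective
grad_g2 = s                               # d/ds of (1/2)(s^2 - alpha^2), problem (1)

x = s**2 - alpha**2
phi_prime = 2.0 * x if x > 0 else 0.0     # phi'(x) = 2x for x > 0, else 0
grad_g2_phi = 0.25 * phi_prime * 2.0 * s  # d/ds of (1/4)phi(s^2 - alpha^2), problem (2)

# Problem (1): stationarity grad_obj + mu*grad_g2 = 0 has a nonnegative multiplier.
mu = -grad_obj / grad_g2
print(f"(1): mu = {mu:.4f}  (>= 0, so KKT holds at s = alpha)")

# Problem (2): the phi-constraint's gradient vanishes on the whole feasible set,
# so grad_obj + mu*grad_g2_phi = 0 has no solution for any mu >= 0.
print(f"(2): constraint gradient = {grad_g2_phi}, objective gradient = {grad_obj:.4f}")
```

The sketch only restates the derivation above: at the shared minimizer the $\phi$-constraint's gradient is zero, so no choice of $\mu_\kappa \geq 0$ can balance the objective gradient in problem $(2)$.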
optimization convex-optimization nonlinear-optimization karush-kuhn-tucker
Taking the derivative of the trace gives the identity, $\frac{d \operatorname{tr}(S_\kappa)}{d S_\kappa} = \mathcal{I}$, and taking the derivative of the exponential of trace comes from the chain rule.
– NeedsToKnowMoreMaths, Jan 15 at 20:03

You need some regularity conditions to apply KKT. For example, the second version of the second constraint can never be strictly feasible.
– copper.hat, Jan 15 at 20:33

So then the second problem has the same primal solution, but does not have a dual solution?
– NeedsToKnowMoreMaths, Jan 15 at 21:24

I would have to think about that. The first problem has no duality gap because of Slater. I would need to do a little work to check if strong duality holds for the second problem.
– copper.hat, Jan 15 at 21:31