Optimizing over a vector and a matrix at the same time
$begingroup$
I want to know whether my understanding of the following problem (proving convexity, formatting it as an SDP) is correct:
\begin{equation*}
\begin{aligned}
& \min_{c \in \mathbb{C}^n,\, D \in \mathbb{H}_+^n} && \|c\|_2 + \lambda \cdot \mathrm{Tr}(D) \\
& \text{subject to} && h^T c + \mathrm{Tr}(H^T D) = b \\
&&& D \succeq 0
\end{aligned}
\end{equation*}
where $D \in \mathbb{H}_+^n$ is a Hermitian positive-semidefinite matrix.
This problem is derived from the trace heuristic for matrix rank minimization, but it is regularized with a vector $c$, and the equality constraint involves $c$.
Proof of Convexity
To show that the problem above minimizes a convex function over a convex set, I show that the objective is jointly convex in $(c,D)$ and that the equality constraint defines a convex set over $(c,D)$. The SDP cone constraint is convex and does not need to be shown.
From here on, I write $x = (c,D) \in \mathbb{C}^n \times \mathbb{H}_+^n$ and $\mathbf{X} = \mathbb{C}^n \times \mathbb{H}_+^n$.
Convex Objective
For any $\theta \in [0,1]$ and given $x_1=(c_1,D_1),\, x_2=(c_2,D_2) \in \mathbf{X}$, denote the convex combination $x_3 = \theta x_1 + (1-\theta) x_2$, noting that $x_3 = (c_3,D_3) \in \mathbf{X}$.
\begin{equation}
\begin{split}
f(\theta x_1 + (1-\theta) x_2) &= \|\theta c_1 + (1-\theta)c_2\|_2 + \lambda \cdot \mathrm{Tr}(\theta D_1 + (1-\theta) D_2) \\
&\leq \theta\|c_1\|_2 + (1-\theta)\|c_2\|_2 + \theta \lambda\, \mathrm{Tr}(D_1) + (1-\theta)\lambda\, \mathrm{Tr}(D_2) \\
&= \theta f(x_1) + (1-\theta) f(x_2)
\end{split}
\end{equation}
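As a quick numerical sanity check (separate from the proof itself), the inequality above can be spot-checked on random points. The sketch below is standard-library Python; the dimension $n$, the weight $\lambda$, and the sampling scheme are arbitrary choices of mine:

```python
# Spot-check joint convexity of f(c, D) = ||c||_2 + lam * Tr(D) on random
# points (an illustration, not a proof). D is built as B^H B, which is
# always Hermitian PSD.
import random

n, lam = 3, 0.7

def rand_c():
    return [complex(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n)]

def rand_psd():
    B = [[complex(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n)]
         for _ in range(n)]
    return [[sum(B[k][i].conjugate() * B[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

def f(c, D):
    return sum(abs(ci) ** 2 for ci in c) ** 0.5 + lam * sum(D[i][i].real for i in range(n))

random.seed(0)
ok = True
for _ in range(200):
    (c1, D1), (c2, D2) = (rand_c(), rand_psd()), (rand_c(), rand_psd())
    t = random.random()
    c3 = [t * a + (1 - t) * b for a, b in zip(c1, c2)]
    D3 = [[t * D1[i][j] + (1 - t) * D2[i][j] for j in range(n)] for i in range(n)]
    ok = ok and f(c3, D3) <= t * f(c1, D1) + (1 - t) * f(c2, D2) + 1e-9
print(ok)  # True: the chord inequality held on every sample
```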
Convex Set
The feasible set is convex simply because the equality constraint is affine in $(c,D)$.
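The affine claim can likewise be illustrated numerically: if two points satisfy $h^Tc + \mathrm{Tr}(H^TD) = b$, then so does any convex combination. All data below ($h$, $H$, and the feasible points) are arbitrary toy values, not from the problem:

```python
# Feasibility under the affine equality h^T c + Tr(H^T D) = b is preserved
# by convex combinations. Note Tr(H^T D) = sum_{i,j} H[i][j] * D[i][j].
t = 0.3
h = [1.0, 2.0]
H = [[1.0, 0.0], [0.0, 3.0]]

def lhs(c, D):
    return (sum(hi * ci for hi, ci in zip(h, c))
            + sum(H[i][j] * D[i][j] for i in range(2) for j in range(2)))

c1, D1 = [1.0, 0.0], [[1.0, 0.0], [0.0, 0.0]]  # lhs = 1 + 1 = 2
c2, D2 = [0.0, 1.0], [[0.0, 0.0], [0.0, 0.0]]  # lhs = 2 + 0 = 2
b = lhs(c1, D1)

c3 = [t * a + (1 - t) * c for a, c in zip(c1, c2)]
D3 = [[t * D1[i][j] + (1 - t) * D2[i][j] for j in range(2)] for i in range(2)]
print(abs(lhs(c3, D3) - b) < 1e-12)  # True: the combination is still feasible
```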
SDP
Let $\mathbf{Y} = \begin{bmatrix} \mathrm{diag}(c) & 0 \\ 0 & D \end{bmatrix}$.
Re-formulate the problem as:
\begin{equation*}
\begin{aligned}
& \min_{Y \in \mathbb{H}_+^{2n}} && \left\langle \begin{bmatrix} \mathrm{diag}(c) & 0 \\ 0 & \lambda I_n \end{bmatrix}, Y \right\rangle_{\mathbb{H}^{2n}_+} \\
& \text{subject to} && \left\langle \begin{bmatrix} \mathrm{diag}(h) & 0 \\ 0 & H \end{bmatrix}, Y \right\rangle_{\mathbb{H}^{2n}_+} = b \\
&&& Y \succeq 0 \quad (\dagger)
\end{aligned}
\end{equation*}
($\dagger$): This constraint is stronger in the sense that it forces $\mathrm{Re}(c) \geq 0$, whereas the original formulation has no such constraint.
My Questions
- Is my proof of convexity correct?
- How do I formulate the problem as an SDP? Specifically, what should I do about the constraint ($\dagger$)?
convex-analysis convex-optimization positive-semidefinite semidefinite-programming
$endgroup$
$begingroup$
do you mean tr($H^TD$) in the constraint of the initial problem?
$endgroup$
– LinAlg
Jan 30 at 19:55
$begingroup$
@LinAlg Yep! Just corrected, thanks!
$endgroup$
– Tingkai Liu
Jan 30 at 20:36
asked Jan 30 at 19:51 by Tingkai Liu; edited Jan 30 at 20:35
1 Answer
$begingroup$
No: you seem to think that a convex set is sufficient, but you also need the convex set to be defined by convex functions (which is the case here). For the objective, you can just state that a sum of convex functions is convex.
You need to write $c$ as the difference of two nonnegative vectors (resulting in an extra block row/column in $Y$). This is the same as formulating a linear optimization problem in standard form where it initially has a free variable.
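A minimal sketch of that splitting trick in the scalar real case (the helper `split` is my own name, not from the answer): any free variable $t$ becomes a difference of two nonnegative ones, exactly as when putting a free variable into LP standard form.

```python
# Write a free real variable t as t = t_plus - t_minus with t_plus, t_minus >= 0.
def split(t):
    return (max(t, 0.0), max(-t, 0.0))

for t in (-2.5, 0.0, 3.0):
    p, m = split(t)
    assert p >= 0 and m >= 0 and abs((p - m) - t) < 1e-12
print("all splits valid")
```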
$endgroup$
$begingroup$
1. When you say a convex set defined by convex functions, do you mean that for $h_i(x) = 0$, $h_i$ has to be convex in $x$? So in my case I would need to argue that $h^Tc + \mathrm{Tr}(H^T D)$ is convex in $(c,D)$? Revisiting Boyd 4.2.1 seems to suggest that equality constraints need to be linear in most cases, unless the problem satisfies specific conditions as in Exercise 4.6. As for the objective, I've just always had trouble thinking with multiple variables: "sum of convex functions is convex" always looks to me like $f(x) + g(x)$ over the same $x$.
$endgroup$
– Tingkai Liu
Jan 30 at 20:52
$begingroup$
Equality constraints indeed need to be linear; for an inequality constraint $g_i(x) \leq 0$, $g_i$ needs to be convex. These statements are stronger than "[the feasible region is a] convex set".
$endgroup$
– LinAlg
Jan 30 at 20:53
$begingroup$
Thanks! As for the second part, re-writing $c = x^{+}-x^{-},\ \mathrm{Re}(x^+) \succeq 0,\ \mathrm{Re}(x^-) \succeq 0$ and defining $Y = \begin{bmatrix} \mathrm{diag}(x^+) & 0 & 0 \\ 0 & \mathrm{diag}(x^-) & 0 \\ 0 & 0 & D \end{bmatrix}$ does the trick! For the proof of convexity, I'm afraid I don't understand what needs to be added to my proof. Would the statement that the equality constraint is linear suffice? For an inequality $g(x) \leq 0$, is convexity required because some functions can admit a convex set under $g(x) \leq 0$ but not under $g(x) \leq \alpha$ (quasiconvex)? Could you give a pathological example?
$endgroup$
– Tingkai Liu
Jan 30 at 21:08
1
$begingroup$
"Would the statement that the equality constraint is linear suffice?" Yes. "Could you give a pathological example?" It can be quite simple: $\{x : x^3 \leq 0\}$ is a convex set, but $g(x)=x^3$ is not a convex function.
$endgroup$
– LinAlg
Jan 30 at 21:19
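The comment's example can be verified directly; the test points $-2$ and $1$ are my own arbitrary choice:

```python
# {x : x^3 <= 0} = (-inf, 0] is a convex set (an interval), yet g(x) = x^3
# is not a convex function: the chord inequality fails at a midpoint.
g = lambda x: x ** 3
x1, x2, t = -2.0, 1.0, 0.5
mid = g(t * x1 + (1 - t) * x2)       # g(-0.5) = -0.125
chord = t * g(x1) + (1 - t) * g(x2)  # 0.5*(-8) + 0.5*1 = -3.5
print(mid > chord)  # True: convexity would require mid <= chord
```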
answered Jan 30 at 19:59 by LinAlg