A condition for 'similarity' of subgradients of convex proper functions
Motivated by the answer to "Under which conditions does small uniform norm imply 'similarity' of subgradients", I have the following question.
Let $\mathcal{S}$ be a set of proper convex functions from $X$ to $\mathbb{R}$, where $X$ is an open and convex subset of $\mathbb{R}^{n}$. I was wondering under which conditions on $\mathcal{S}$ we have
\begin{gather}
\forall \epsilon>0 \ \exists \delta>0 \text{ such that for } f,h \in \mathcal{S},\ \|f-h\|_{\infty} < \delta \\
\Rightarrow \sup_{x \in X}\ \inf_{v \in \partial f(x),\, w \in \partial h(x)} \|v-w\|_2 < \epsilon.
\end{gather}
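Note that without further restrictions on $\mathcal{S}$ the implication seems to fail near the boundary of $X$ (a one-dimensional example; please correct me if it is wrong): on $X=(-1,1)$, take $f \equiv 0$ and, for small $\delta>0$,
$$h_\delta(x) = \max\{0,\ |x| - (1-\delta)\}.$$
Then $\|f-h_\delta\|_{\infty} \le \delta$, but for $1-\delta < |x| < 1$ we have $\partial f(x)=\{0\}$ and $\partial h_\delta(x)=\{\operatorname{sgn}(x)\}$, so the sup-inf quantity above equals $1$ no matter how small $\delta$ is. So presumably one needs something like equi-Lipschitz continuity of $\mathcal{S}$, or the supremum taken only over compact subsets of $X$.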
Motivation: Intuitively, if $\|f-h\|_{\infty}$ is small, the graphs of the two functions have similar shapes, and hence their supporting hyperplanes might also be similar. Of course, this is just a picture I have in mind for the one-dimensional case.
Where the problem comes from: I have a map $F$ that sends a convex function to an element of its subdifferential at a given point. I would like to show that if the functions are similar, i.e. $\|f-h\|_{\infty}$ is sufficiently small, then $\|F(f)-F(h)\|_{2}$ is small.
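For intuition, here is a minimal numerical sketch of the quantity above (the two functions, the grid, and the step size are just illustrative choices). For smooth convex functions the subdifferential is a singleton, so the inner infimum reduces to a derivative difference; the grid stays in a compact subset of $X$, which matters given the boundary example above.

```python
import numpy as np

# Illustrative smooth convex functions on X = (-1, 1):
# h is a uniformly small convex perturbation of f.
f = lambda x: x**2
h = lambda x: x**2 + 0.01 * np.cosh(x)   # ||f - h||_inf = 0.01*cosh(1) on (-1, 1)

def grad(fun, x, eps=1e-6):
    """Central-difference derivative; for a smooth convex fun,
    the subdifferential at x is the singleton {fun'(x)}."""
    return (fun(x + eps) - fun(x - eps)) / (2 * eps)

xs = np.linspace(-0.99, 0.99, 2001)                   # compact subset of X
sup_norm = np.max(np.abs(f(xs) - h(xs)))              # estimates ||f - h||_inf
sup_inf = np.max(np.abs(grad(f, xs) - grad(h, xs)))   # sup_x inf ||v - w||_2

print(f"||f - h||_inf   ~ {sup_norm:.4f}")
print(f"sup_x inf |v-w| ~ {sup_inf:.4f}")
```

For this smooth pair both quantities come out small, consistent with the hoped-for implication; the question is what conditions on $\mathcal{S}$ guarantee this in general.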
Some references: A similar question was asked in "Hausdorff Distance between Subdifferential sets". The problem is that the theory there works with more general notions of convergence and analyzes the convergence of subdifferentials in that framework. I have almost no background in functional analysis, so I would just like to know under which conditions the type of 'convergence' defined above holds.
functional-analysis convex-analysis convex-optimization
asked 15 hours ago
sigmatau