If $f_n$ converge uniformly to $f$, then $\mathrm{d}f_n(x_n)$ converges to $\mathrm{d}f(x)$












Let $f_n : \mathbb{R}^p \to \mathbb{R}$ be such that the $f_n$ are $C^1$ and such that the sequence $(f_n)_{n \in \mathbb{N}}$ converges uniformly to a function $f : \mathbb{R}^p \to \mathbb{R}$ which is $C^1$. Prove that for all $x \in \mathbb{R}^p$ there is a sequence $(x_n)_{n \in \mathbb{N}}$ which converges to $x$ such that $\mathrm{d}f_n(x_n)$ converges to $\mathrm{d}f(x)$.




I must say that I don't know at all how to proceed and don't have any intuition about what is really going on here. So we might look at the case where $p = 1$.



So we can write:

$$f(a+h) = f(a) + f'(a)h + o(h)$$
$$\forall n \in \mathbb{N},\quad f_n(a+h) = f_n(a) + f'_n(a)h + o(h)$$



Hence we have:

$$\mid f'(a)h - f'_n(a)h \mid \leq \mid f(a+h)-f(a) \mid + \mid f(a)-f_n(a) \mid + \mid o(h) \mid$$



Since the functions $f_n$ converge uniformly to $f$, we have:
$$\mid f'(a)h - f'_{\infty}(a)h \mid \leq \mid o(h) \mid$$
And now we let $h \to 0$, so that:



$$\mid f'(a) - f_\infty'(a) \mid = 0$$



I don't know if this works, but it feels strange to me since in that case the sequence $x_n$ is just the constant sequence... and moreover, if this is correct, I don't see at all how to generalise it to higher dimensions.



Thank you!
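
For intuition, here is a standard one-dimensional example (not part of the original statement) showing that the constant choice $x_n = x$ cannot work in general, so the moving points really are needed. Take $f_n(x) = \frac{\sin(nx)}{n}$. Then $\sup_x |f_n(x)| = \frac{1}{n} \to 0$, so $f_n \to f \equiv 0$ uniformly and $f$ is $C^1$ with $f' \equiv 0$; but $f_n'(x) = \cos(nx)$, so at the fixed point $x = 0$ we get $f_n'(0) = 1$ for every $n$, which does not converge to $f'(0) = 0$. Choosing instead $x_n = \frac{\pi}{2n} \to 0$ gives
$$f_n'(x_n) = \cos\!\left(\tfrac{\pi}{2}\right) = 0 = f'(0),$$
so the conclusion can only hold along a well-chosen sequence $x_n \to x$.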










real-analysis calculus sequences-and-series multivariable-calculus uniform-convergence






asked Jan 28 at 23:30









dghkgfzyukz

  • Your demonstration is not correct, since you don't know whether $\lim_{n \to \infty} f'_n(a)$ converges, hence you cannot denote it by $f'_{\infty}(a)$; that's why you always need to be careful when dealing with double limits.
    – Thinking
    Jan 28 at 23:42






  • For complex or vector valued functions the claim is wrong. Consider the sequence $f_n(x):={1\over n}e^{inx}$, which converges uniformly to $0$, but $|f_n'(x)|=1$ for all $x$ and $n$.
    – Christian Blatter
    Jan 31 at 10:34
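
To verify that counterexample with a one-line computation: $f_n'(x) = \frac{1}{n}\,(in)\,e^{inx} = i\,e^{inx}$, so $|f_n'(x)| = 1$ for every $x$ and $n$, while $\sup_x |f_n(x)| = \frac{1}{n} \to 0$; hence no choice of points $x_n$ can make $f_n'(x_n)$ converge to $f'(x) = 0$, and the real-valued hypothesis in the statement is genuinely used.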


















2 Answers

We first prove the result in the case where there is a local maximum. At a maximum the differential vanishes, so the key claim is the following:



Lemma. If $(g_n)_n$ is a sequence of $C^1$ functions $\mathbb{R}^p \to \mathbb{R}$ converging uniformly to a $C^1$ function $g$ having a local (strict) maximum at $y$, that is $g(x)<g(y)$ for all $x\neq y$ in a ball neighborhood $B(y,r)$ of $y$, then there exists a sequence $x_n$ such that $\lim x_n=y$ and $dg_n(x_n)=0$ for sufficiently large $n$.



Proof of the Lemma: Pick $N$ sufficiently large so that $\forall n\geq N$,
$$\sup_{|x-y|=r} g_n(x) < g_n(y).$$
The existence of such an $N$ follows from the fact that the corresponding inequality is true for $g$ by hypothesis, together with the uniform convergence of $g_n$.
For any $n\geq N$, pick $x_n$ to be a maximum of $g_n$ on the closed ball $B(y,r)$. Because of the previous inequality, $x_n$ is in the interior of the ball $B(y,r)$, so the derivative satisfies $dg_n(x_n)=0$.
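
To make the choice of $N$ explicit (this is just an unpacking of the previous sentence, using that the strict inequality $g(x)<g(y)$ also holds on the sphere $|x-y|=r$, as it does in the application below): set
$$\epsilon := g(y) - \sup_{|x-y|=r} g(x) > 0,$$
which is positive because the sphere is compact, $g$ is continuous, and $g < g(y)$ there. If $n$ is large enough that $\sup_{\mathbb{R}^p} |g_n - g| < \epsilon/3$, then for every $x$ with $|x-y|=r$,
$$g_n(x) \le g(x) + \tfrac{\epsilon}{3} \le g(y) - \tfrac{2\epsilon}{3} < g(y) - \tfrac{\epsilon}{3} \le g_n(y).$$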



Let $x$ be the limit of a convergent subsequence of $(x_n)$. Since $g_n(x_n)\geq g_n(y)$ by definition of $x_n$, taking the limit we have $g(x)\geq g(y)$, and of course $x$ is still in the closed ball $B(y,r)$. So necessarily $x=y$, since $y$ is a strict local maximum of $g$ on $B(y,r)$. As the $x_n$ lie in a compact ball and every convergent subsequence tends to $y$, the whole sequence converges to $y$. This concludes the proof of the Lemma.



Now, to deal with the general case, pick a point $y$ and define
$$g_n(x) = f_n(x) - f(x) - |x-y|^2.$$
Clearly, this sequence of $C^1$ functions converges uniformly to $g(x) = -|x-y|^2$, which has a strict local maximum at $y$.
We now apply the Lemma, so there is a sequence $(x_n)_n$ such that $\lim x_n = y$ and
$dg_n(x_n)=0$. But
$$dg_n(x_n).h = df_n(x_n).h - df(x_n).h - 2\langle x_n-y, h \rangle,$$
so
$$df_n(x_n) = df(x_n) + 2\langle x_n-y, \cdot \rangle.$$
The linear forms $h \mapsto 2\langle x_n-y, h \rangle$ converge to zero because of the Cauchy–Schwarz inequality, and $df(x_n)$ converges to $df(y)$ since $f$ is $C^1$. This concludes the proof.
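
To quantify the last two convergences (a small elaboration of the estimate above): by Cauchy–Schwarz the operator norm of the correction term is
$$\sup_{|h| \le 1} 2\,|\langle x_n - y, h\rangle| = 2\,|x_n - y| \longrightarrow 0,$$
so
$$\|df_n(x_n) - df(y)\| \le \|df(x_n) - df(y)\| + 2\,|x_n - y| \longrightarrow 0,$$
since $df$ is continuous at $y$. This is exactly the required convergence $df_n(x_n) \to df(y)$.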






answered Jan 31 at 18:17
user120527
  • This is clever! I am wondering why it's actually simpler to prove the problem when there is a local minimum/maximum. I guess it's because in that case we know the value of $\mathrm{d}f$ at this point. I'll deliver the bounty tonight. Thank you!
    – dghkgfzyukz
    Jan 31 at 18:59



















In one dimension we don't need uniform convergence; pointwise convergence will do. Also, we only need $f$ and $f_n, n=1,2,\dots$ differentiable everywhere, not necessarily $C^1.$

WLOG we can assume $f\equiv 0$, because $f_n(x)-f(x)\to 0$ everywhere and $f_n-f$ is differentiable everywhere.

Fix $x\in \mathbb R.$ Let $\delta > 0.$ Then for $n\in \mathbb N$ the MVT shows there exists $c(n,\delta)\in (x,x+\delta)$ such that

$$f_n(x+\delta) - f_n(x) = f_n'(c(n,\delta))\,\delta.$$

Since the left side $\to 0$ as $n\to \infty,$ we can make the right side as small as we like by taking $n$ large. We can thus find $N = N_\delta$ such that $n\ge N_\delta$ implies $|f_n'(c(n,\delta))| < \delta.$
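
To see why such an $N_\delta$ exists (spelling out the previous sentence under the reduction $f \equiv 0$): since $f_n(x+\delta) \to 0$ and $f_n(x) \to 0$,
$$|f_n'(c(n,\delta))| = \frac{|f_n(x+\delta) - f_n(x)|}{\delta} \longrightarrow 0,$$
so in particular $|f_n'(c(n,\delta))| < \delta$ for all $n$ large enough.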



Now think of $\delta_k = 1/k,\ k=1,2,\dots.$ Then from the above there exist integers $0<N_1<N_2 < \cdots$ such that $N_k\le n < N_{k+1}$ implies $|f_n'(c(n,1/k))| < 1/k.$ If we then define

$$x_n = c(n,1/k), \quad N_k\le n < N_{k+1},$$



we have $x_n\to x$ and $f_n'(x_n)\to 0.$ (We can let $x_n$ be anything for $1\le n < N_1.$)
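
To spell out why this choice works: for $N_k \le n < N_{k+1}$ we have $x_n = c(n,1/k) \in (x, x+1/k)$, so
$$|x_n - x| < \tfrac{1}{k} \qquad \text{and} \qquad |f_n'(x_n)| = |f_n'(c(n,1/k))| < \tfrac{1}{k},$$
and since each block $\{N_k, \dots, N_{k+1}-1\}$ is finite, $k \to \infty$ as $n \to \infty$; hence $x_n \to x$ and $f_n'(x_n) \to 0 = f'(x)$ under the reduction $f \equiv 0$.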






answered Feb 5 at 18:12
zhw.
  • You can't say WLOG $f = 0$ since at the beginning you are assuming $f$ is not necessarily $C^1$.
    – Thinking
    Feb 5 at 19:02

  • @Thinking I don't understand your comment.
    – zhw.
    Feb 5 at 21:45

  • It's not so clear to me that you are not losing generality by saying $f = 0$, since $f$ is not necessarily $C^1$.
    – Thinking
    Feb 5 at 22:18

  • @Thinking Like I wrote, I am assuming $f$ and $f_n, n=1,2,\dots$ are differentiable everywhere. Where do you think the proof goes wrong?
    – zhw.
    Feb 5 at 22:22













