Derivative of the matrix of eigenvalues of a real symmetric matrix
Given a real symmetric matrix $A$ with entries depending on $t$, the derivative of the $p$-th eigenvalue with respect to $t$ is given by
$$
\lambda_p' = v_p^T A' v_p
$$
where $A'$ denotes the derivative of matrix $A$. This can be derived by premultiplying
$$
A' v_p + A v_p' = \lambda_p' v_p + \lambda_p v_p'
$$
with $v_p^T$, imposing that the eigenvectors have unit length (and thus $v_p \cdot v_p' = 0$), and making use of the fact that $A$ is symmetric.
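As a quick sanity check of this formula (not part of the derivation), here is a small NumPy sketch that compares $v_p^T A' v_p$ with a finite-difference derivative of the eigenvalues; the particular $A(t)$ used is an arbitrary smooth symmetric example I made up for illustration.

```python
# Numerical sanity check of lambda_p' = v_p^T A' v_p for a toy A(t).
# Sketch only (NumPy + finite differences); A(t) is an arbitrary smooth
# symmetric-matrix-valued function chosen just for illustration.
import numpy as np

def A(t):
    B = np.array([[1.0, 2.0, 0.5],
                  [0.3, -1.0, 1.5],
                  [0.7, 0.2, 2.0]])
    M = np.cos(t) * B + np.sin(t) * B @ B.T
    return (M + M.T) / 2                      # symmetric for every t

t, h = 0.3, 1e-6
A_prime = (A(t + h) - A(t - h)) / (2 * h)     # finite-difference A'(t)

w, V = np.linalg.eigh(A(t))                   # eigenvalues w, orthonormal columns of V
w_plus, _ = np.linalg.eigh(A(t + h))
w_minus, _ = np.linalg.eigh(A(t - h))
lam_dot_fd = (w_plus - w_minus) / (2 * h)     # finite-difference lambda_p'

lam_dot_formula = np.array([V[:, p] @ A_prime @ V[:, p] for p in range(3)])
print(np.allclose(lam_dot_fd, lam_dot_formula, atol=1e-4))   # expect True
```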
Say I want to use the same approach on the eigendecomposition of $A$
$$
A V = V \Lambda \\
A'V + A V' = V' \Lambda + V \Lambda' \\
V^T A'V + V^T A V' = V^T V' \Lambda + V^T V \Lambda' \\
V^T A'V + V^T A V' = V^T V' \Lambda + \Lambda' \\
\Lambda' = V^T A'V + V^T A V' - V^T V' \Lambda \\
\Lambda' = V^T A'V + \Lambda V^T V' - V^T V' \Lambda
$$
also
$$
I' = (V^T V)' = {V^T}'V + V^T V' = 0
$$
However, I don't see how I could use this statement to simplify the earlier expression. I feel like I'm missing something really obvious.
How can I express the derivative of the matrix of eigenvalues $\Lambda$ in terms of the derivative of $A$?
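For reference, here is a rough numerical sketch of the differentiated orthogonality identity above: it estimates $V'$ by a finite difference (with a crude sign alignment of the eigenvector columns between nearby values of $t$, assuming simple eigenvalues) and checks that $V^T V'$ comes out antisymmetric, which is just a restatement of $(V^T V)' = 0$. The toy $A(t)$ is the same arbitrary example as in the earlier snippet.

```python
# Rough check that V^T V' is antisymmetric, i.e. the identity (V^T V)' = 0 above.
# Sketch only: A(t) is the same arbitrary symmetric example as before, V' is a
# one-sided finite difference, and eigenvector column signs returned by eigh
# are aligned by hand (eigenvalues are assumed to be simple, no crossings).
import numpy as np

def A(t):
    B = np.array([[1.0, 2.0, 0.5],
                  [0.3, -1.0, 1.5],
                  [0.7, 0.2, 2.0]])
    M = np.cos(t) * B + np.sin(t) * B @ B.T
    return (M + M.T) / 2

t, h = 0.3, 1e-6
_, V0 = np.linalg.eigh(A(t))
_, V1 = np.linalg.eigh(A(t + h))
V1 *= np.sign(np.sum(V0 * V1, axis=0))   # align column signs with V0
V_dot = (V1 - V0) / h                    # finite-difference estimate of V'

S = V0.T @ V_dot
print(np.allclose(S + S.T, 0.0, atol=1e-3))   # expect True: V^T V' is skew-symmetric
```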
linear-algebra matrices derivatives symmetric-matrices
asked Nov 22 '18 at 11:15 by user495268; edited Nov 22 '18 at 11:27
Is $A$ a function of $t$, or something like that? I ask because you differentiate, but you do not specify with respect to what. Also "eigenvalues have unit length" makes no sense.
– Giuseppe Negro
Nov 22 '18 at 11:20
Sorry, I edited the typo. Yes, for example the entries of $A$ could be a function of $t$ and we are taking the derivative with respect to $t$.
– user495268
Nov 22 '18 at 11:24
It is much better to specify the dependence on $t$ in the main text. That's an important point.
– Giuseppe Negro
Nov 22 '18 at 11:26
Now I finally understood your question. It is a nice one, +1. I have a semi-serious question; I suppose that $$\Lambda'=\mathrm{diag}(V_1^TA'V_1, V_2^TA'V_2,\ldots, V_n^TA'V_n),$$ where $V_j$ denotes the $j$-th column of $V$, is not the answer you are looking for?
– Giuseppe Negro
Nov 22 '18 at 12:13
Oh, and wait a minute. If you impose that the eigenvectors are orthonormal, as you may since $A$ is symmetric, then $V$ is an orthogonal matrix, that is, $V^TV=I$. Differentiating this you should get something, just like in the beginning of the post you differentiated $v_i\cdot v_j=\delta_{ij}$. (HTH)
– Giuseppe Negro
Nov 22 '18 at 12:15