Intuition behind speciality of symmetric matrices
What is the geometric intuition behind the fact that only matrices that are similar to a symmetric matrix are diagonalizable?
So e.g. why is it important that the multiplier of the first component of the last basis vector be the same as the multiplier of the last component of the first basis vector (i.e. that in an $n \times n$ matrix the $(n,1)$ entry be the same as the $(1,n)$ entry)?
linear-algebra
asked May 17 '16 at 12:34 by Nesa
It seems that what you should really be after is "why are symmetric matrices diagonalizable"? You should look into proofs of the spectral theorem.
– Omnomnomnom, May 17 '16 at 13:47
2 Answers
When you were first learning about null spaces in linear algebra, your guess for the null space -- assuming you had some reasonable geometric intuition for the subject -- was that the null space is orthogonal to the column space. After all, that makes sense: if your singular transformation collapses/projects $\mathbb{R}^2$ onto a line, then the vectors that get mapped to the origin are the ones perpendicular to the column space.
Or at least, so it seems -- in reality, though, the projection doesn't need to be so nice and orthogonal. You could, for instance, rotate all vectors in the space by some angle and then collapse it onto a line.
It turns out the null space isn't perpendicular to the column space, but to the row space instead -- and these two spaces coincide only for matrices which do not perform a rotation. A short numerical check below illustrates the difference.
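For instance (a minimal NumPy sketch, with an arbitrarily chosen rotate-then-project matrix, purely as an illustration):

```python
import numpy as np

# Rotate the plane by 45 degrees, then project onto the x-axis:
# a singular, non-symmetric transformation.
theta = np.pi / 4
rotate = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
project = np.array([[1.0, 0.0],
                    [0.0, 0.0]])
A = project @ rotate

# The right singular vector belonging to the zero singular value spans the null space.
_, _, vt = np.linalg.svd(A)
null_vec = vt[-1]

row_vec = A[0]        # spans the row space (the second row of A is zero)
col_vec = A[:, 0]     # spans the column space (the x-axis)

print(np.dot(null_vec, row_vec))   # ~0: the null space IS orthogonal to the row space
print(np.dot(null_vec, col_vec))   # ~±0.5: it is NOT orthogonal to the column space
```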
This is a very important observation, because it tells you something about the character of matrices -- asymmetry in a matrix is a measure of how rotation-ish it is. Specifically, an antisymmetric matrix is the result of 90-degree rotations (like imaginary numbers) and a symmetric matrix is the result of scaling and skews (like real numbers).
$$A = \underbrace{\frac{1}{2}(A + A^T)}_{\text{symmetric part}} + \underbrace{\frac{1}{2}(A - A^T)}_{\text{antisymmetric part}}$$
Every matrix can be written as the sum of these two kinds of parts -- a symmetric part and an antisymmetric part -- much like every complex number can be written as the sum of a real part and an imaginary part. And this is fundamentally why symmetric matrices are "special" -- for the same reason that real numbers are special.
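Here is the same split numerically (a minimal NumPy sketch; the matrix is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

S = 0.5 * (A + A.T)   # symmetric part: [[2, 3], [3, 3]]
K = 0.5 * (A - A.T)   # antisymmetric part: [[0, -2], [2, 0]], i.e. twice a 90-degree rotation

assert np.allclose(A, S + K)   # the two parts add back up to A
assert np.allclose(S, S.T)     # symmetric, like the "real part"
assert np.allclose(K, -K.T)    # antisymmetric, like the "imaginary part"
```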
Notes:
(1) Scaling and skews are actually essentially the same thing, which is why it makes sense to include skews in the group of things that are "essentially real numbers", even though you can't really represent skews with any complex number -- real or otherwise. Skews are just scaling across a different set of axes, called "eigenvectors" (this is also why symmetric matrices have eigenvectors).
(2) My explanation of the analogy (between matrices and complex numbers) is oversimplified -- antisymmetric matrices actually represent 90-degree rotations only, and these rotations can actually be spirals, which means they do scaling too. But the analogy still holds, because this applies to imaginary numbers too (e.g. the complex number $8i$ is a rotation by 90 degrees followed by a scaling by 8).
(3) A more accurate way to phrase the analogy is "the antisymmetric part of the matrix operates in a subspace orthogonal to the vector being transformed while the symmetric part operates in the direction of the vector itself, so their sum spans all possible vectors of the target space". In other words, the analogy is to the Cartesian form of complex numbers -- you get to represent transformations as linear combinations of the vector itself and vectors orthogonal to it.
(4) It is possible to deal with at least some matrices in a way that corresponds to the polar forms of complex numbers -- this is done by representing matrices as products of symmetric matrices and orthogonal matrices, much like $re^{i\theta}$ represents complex numbers as products of real numbers and unit complex numbers.
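Note (4) describes the polar decomposition. Assuming SciPy is available, a minimal sketch (again with an arbitrary matrix) looks like this:

```python
import numpy as np
from scipy.linalg import polar

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

U, P = polar(A)   # right polar decomposition: A = U @ P

assert np.allclose(A, U @ P)
assert np.allclose(U @ U.T, np.eye(2))   # U is orthogonal -- the "unit complex number" factor
assert np.allclose(P, P.T)               # P is symmetric positive semidefinite -- the "r" factor
```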
answered May 14 '18 at 8:42 by Abhimanyu Pallavi Sudhir (edited Jan 16 at 11:25)
"only matrices that are similar to a symmetric matrix are diagonalizable"
This statement is true, but quite useless. I prefer the statement
"only matrices that are similar to a diagonal matrix are diagonalizable"
which is, hum... the definition itself of diagonalizability. The geometrical intuition of diagonalizability is that you can decompose a transformation into homotheties, which are the simplest geometrical transformations you could imagine.
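Concretely, for a symmetric matrix the orthogonal eigendecomposition exhibits exactly those scalings along perpendicular axes (a minimal NumPy sketch; the matrix is arbitrary):

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # a symmetric matrix

eigenvalues, Q = np.linalg.eigh(S)  # eigh returns orthonormal eigenvectors for symmetric input
D = np.diag(eigenvalues)

assert np.allclose(S, Q @ D @ Q.T)  # rotate into the eigenbasis, scale, rotate back
print(eigenvalues)                  # [1. 3.]: the scale factors along the two eigenvector axes
```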
answered May 17 '16 at 12:38 by R. Bourgeon