Block diagonalizing two matrices simultaneously
There are two matrices $A$ and $B$ that cannot be diagonalized simultaneously. Is it possible to block diagonalize them? What if the matrices have a special pattern?
The physics of the problem is that I have $2$ layers of atoms, where $A$ is the connectivity within layer $1$ itself and $B$ is the connectivity between layers $1$ and $2$.
By a block diagonal matrix I mean a matrix whose diagonal entries are matrices of size $M \times M$ (where $M$ is much smaller than the size of the full matrix but larger than $1$).
For example, a $3 \times 3$ block diagonal matrix is
$$\begin{pmatrix} [a] & [0] & [0] \\ [0] & [b] & [0] \\ [0] & [0] & [c] \end{pmatrix}$$
I have attached figures of the sparsity patterns of an example $A$ and $B$.
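(For concreteness, a small numpy/scipy sketch of this block structure; the blocks below are arbitrary $2 \times 2$ placeholders, not the actual connectivity matrices:)

```python
import numpy as np
from scipy.linalg import block_diag

a = np.array([[1., 2.], [3., 4.]])   # block [a], here M = 2
b = np.array([[5., 6.], [7., 8.]])   # block [b]
c = np.array([[9., 0.], [0., 9.]])   # block [c]

D = block_diag(a, b, c)              # 6x6 matrix with a, b, c on the diagonal
```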
matrices eigenvalues-eigenvectors diagonalization block-matrices eigenfunctions
asked Aug 10 '14 at 7:01 by Hesam · edited Aug 22 '14 at 0:17 by Hesam
– Ewan Delanoy (Aug 21 '14 at 8:33, +2): Perhaps your question will make a lot of sense to physicists, but I think you did not take enough time to translate it into something understandable by mathematicians. What do you mean by "block diagonalize"? Perhaps you could point to a Wikipedia link explaining what the "connectivity" is.
– Hesam (Aug 21 '14 at 8:40): I will add information.
– Hesam (Aug 21 '14 at 23:49): I have added the details.
– Victor Liu (Aug 22 '14 at 0:20, +5): I believe this is equivalent to finding invariant subspaces common to both matrices.
– Mustafa Said (Aug 22 '14 at 0:29, +1): Do they share a common eigenvalue?
1 Answer
I will propose a method for finding the optimal simultaneous block-diagonalization of two matrices $A$ and $B$, assuming that $A$ is diagonalizable (and eigenvalues are not too degenerate).
(Something similar may work with the Jordan normal form of $A$ as well.)
By optimal I mean that none of the corresponding blocks are simultaneously block-diagonalizable.
Note that using the trivial block-diagonalization $A=(A)$ and $B=(B)$ this can always be done.
Let me start by verifying Victor Liu's comment that the problem is equivalent to expressing your vector space as a direct sum of invariant subspaces.
I will write $C \oplus D = \begin{pmatrix} C & 0 \\ 0 & D \end{pmatrix}$ and similarly for several matrices.
I assume that the matrices operate in the vector space $V = \mathbb{R}^n$ (or $\mathbb{C}^n$; this is irrelevant).
If the two matrices $A$ and $B$ can be block-diagonalized simultaneously, $A = A_1 \oplus A_2$ and $B = B_1 \oplus B_2$, then $V$ is a direct sum $V = V_1 \oplus V_2$, where the spaces $V_i$ are invariant under $A$ and $B$.
A subspace $U \subset V$ is said to be invariant under $A$ if $A(U) \subset U$.
Similarly, if $V$ is a direct sum of subspaces invariant under both $A$ and $B$, then $A$ and $B$ can be simultaneously block-diagonalized.
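(As a quick numerical illustration of this invariance condition, here is a minimal numpy sketch; the function name `is_invariant` and the tolerance are my own choices, not from the thread:)

```python
import numpy as np

def is_invariant(M, U, tol=1e-10):
    # span(U) is invariant under M iff M @ U stays inside the column
    # space of U, i.e. the residual after projecting back is ~0.
    Q, _ = np.linalg.qr(U)                      # orthonormal basis of span(U)
    R = M @ Q - Q @ (Q.conj().T @ (M @ Q))      # component of M(U) outside span(U)
    return np.linalg.norm(R) < tol
```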
Now what are the subspaces of $V$ that are invariant with respect to both $A$ and $B$?
Suppose $U \neq \{0\}$ is such a subspace.
Suppose furthermore that there is another subspace $U' \subset V$ so that $U$ and $U'$ only intersect at the origin but together they span all of $V$: $V = U \oplus U'$.
(In other words, $U$ is an invariant subspace complementable by an invariant subspace. The spaces in the decomposition need to be of this type.)
Now $U$ is invariant under $A$.
Let us first diagonalize $A$.
Let $\lambda_1, \dots, \lambda_m$ be the eigenvalues and $V_1, \dots, V_m$ the corresponding eigenspaces.
We can thus write $V$ as a direct sum $V = V_1 \oplus \dots \oplus V_m$.
Since $U$ is invariant under $A$, it is necessarily of the form $U = U_1 \oplus \dots \oplus U_m$, where $U_i$ is a subspace of $V_i$ (it can be zero or the whole eigenspace).
Suppose one of the eigenvalues of $A$, say $\lambda_1$, is simple (perhaps the ground state if you have a quantum mechanical system).
Then $V_1 = \langle v_1 \rangle$ has to belong to $U$ or its complementing space; suppose it belongs to $U$.
We also need to have $Bv_1 \in U$.
Now $Bv_1 = x_1 \oplus \dots \oplus x_m \in V_1 \oplus \dots \oplus V_m$.
By the direct sum form of $U$, we must have $x_i \in U$ for all $i$.
Now we have a list of new vectors $x_1, \dots, x_m$ in $U$.
For each of them we can run the same argument: $Bx_i = \dots \in V_1 \oplus \dots \oplus V_m$ must be in $U$ and so must each of its components (with respect to the eigenspace decomposition).
Put differently, if $\pi_i : V \to V_i$ is the projection onto the eigenspace $V_i$ along the other eigenspaces, the following is true: if $u \in U$, then $\pi_i B u \in U$ for all $i$.
It can be enlightening to calculate the matrices $\pi_i B$.
(You could probably see something at a glance, but I have not developed an intuition for it.)
This gives an (at least computationally feasible) method to find an invariant subspace $U$ containing $V_1$.
It can well happen that $U=V$ and only the trivial simultaneous block-diagonalization exists.
If not, then after some number of iterations you will notice that the above procedure no longer produces new linearly independent vectors, and $U$ is a nontrivial invariant subspace.
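For what it's worth, here is a rough numpy sketch of this iteration (my own rendering of the procedure above, assuming $A$ is diagonalizable; the names and tolerances are mine). It grows an orthonormal basis of the smallest common invariant subspace containing a starting eigenvector:

```python
import numpy as np

def grow_invariant_subspace(A, B, v0, tol=1e-10):
    # Smallest subspace containing v0 (an eigenvector of A) that is
    # invariant under both A and B, built by repeatedly projecting B*u
    # onto the eigenspaces of A, as described above.
    n = A.shape[0]
    lam, V = np.linalg.eig(A)          # columns of V: eigenvectors of A
    Vinv = np.linalg.inv(V)            # gives coordinates in the eigenbasis
    # group indices of (numerically) equal eigenvalues -> eigenspaces
    groups, used = [], np.zeros(n, dtype=bool)
    for i in range(n):
        if not used[i]:
            g = np.flatnonzero(np.abs(lam - lam[i]) < 1e-8)
            used[g] = True
            groups.append(g)
    U = (v0 / np.linalg.norm(v0)).reshape(n, 1)   # orthonormal basis so far
    while True:
        grew = False
        for u in U.T.copy():
            c = Vinv @ (B @ u)                    # eigenbasis coordinates of B*u
            for g in groups:
                x = V[:, g] @ c[g]                # pi_g(B*u): one eigenspace component
                r = x - U @ (U.conj().T @ x)      # Gram-Schmidt: part not yet in U
                if np.linalg.norm(r) > tol:
                    U = np.hstack([U, (r / np.linalg.norm(r)).reshape(n, 1)])
                    grew = True
        if not grew:
            return U                              # pi_g(B*u) in span(U) for all u, g
```

If this returns with `U.shape[1] < n`, the span of `U` is a nontrivial candidate block.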
It does not yet follow that $U$ would be complementable by an invariant subspace.
Assuming there is another simple eigenvalue (whose eigenspace is not contained in $U$), you can start the same construction with that one and produce a new invariant subspace $U'$.
You will automatically have $U \cap U' = \{0\}$.
You can also start at any eigenvector of $B$ corresponding to a simple eigenvalue.
If you somehow know that a particular vector needs to be in an invariant subspace, you can use it as well.
(But be careful: an eigenvector corresponding to a degenerate eigenvalue need not be in an invariant subspace.)
With this method you can generate a number of invariant subspaces.
If these subspaces span the whole space, the corresponding block-diagonalization is optimal.
If they do not span all of $V$, at least the rest of the problem is easier.
(Note that even if $A$ is symmetric, a complementing invariant subspace need not be orthogonal to $U$.)
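(And once you have bases of common invariant subspaces that together span $V$, a change of basis makes the block structure explicit; a sketch, with my own function name:)

```python
import numpy as np

def simultaneous_block_form(A, B, bases):
    # bases: list of matrices whose columns span the common invariant
    # subspaces, assumed independent and together spanning all of V.
    # Conjugating by S = [U1 | U2 | ...] puts both A and B into block
    # diagonal form, with block sizes given by the subspace dimensions.
    S = np.hstack(bases)
    Sinv = np.linalg.inv(S)
    return Sinv @ A @ S, Sinv @ B @ S
```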
This is not a complete answer to your question, but perhaps you can solve your problem with something along these lines.
And do ask if you need clarification.
Further details on block diagonalization can be found in this follow-up question.
answered Aug 23 '14 at 16:11 by Joonas Ilmavirta · edited Jan 7 at 10:24
– Amin (Jan 6 at 12:47): Regarding Victor Liu's comment that this is "equivalent to finding invariant subspaces common to both matrices": I would like to know more about this equivalence. Is there a reference for this theorem? Moreover, it is a bit difficult to follow your answer for finding such subspaces. Could you please elaborate or give a reference? Thanks.
– Joonas Ilmavirta (Jan 6 at 20:20): @Amin I recommend that you ask a separate question about that. I can give an answer, but someone else might beat me to it. Explaining that point will take some space and is well worth a new question.