If $A^k$ commutes with $B$ then $A$ commutes with $B$.
Let $A$ and $B$ be two $n \times n$ complex matrices. Assume that $(A-I)^n=0$ and $A^kB=BA^k$ for some $k \in \mathbb{N}$. I want to prove that $AB=BA$.
Clearly $1$ is the only eigenvalue of $A$, and $A^k$ and $B$ are simultaneously triangularisable. But how do I get $A$ to commute with $B$? Any help will be appreciated. Thanks.
linear-algebra abstract-algebra matrices field-theory
2
Note that $\mathbb{N}$ has to mean $\left\{1,2,3,\ldots\right\}$ for this to be correct.
– darij grinberg
19 hours ago
asked yesterday
Panja
2 Answers
Hint. Try to prove that $A$ is a polynomial in $A^k$.
Edit. To prove the hint, you may follow darij grinberg's comment below. I did essentially the same thing, but from a matrix analytic rather than linear algebraic perspective: I considered, for a nilpotent matrix $X$, the primary matrix function $f(X)=(I+X)^{1/k}=\sum_{i=0}^\infty \frac{f^{(i)}(0)}{i!}X^i$ associated with the scalar function $f(x)=(1+x)^{1/k}$. Put $X=A^k-I$ and we are done.
Alternatively, the hint can be proved using only Jordan forms, but the argument is much longer.
- Let $J$ be the Jordan form of $A$. Since all eigenvalues of $A$ are ones, we may write $J=J_{m_1}\oplus J_{m_2}\oplus\cdots\oplus J_{m_b}$, where $1\le m_1\le m_2\le\cdots\le m_b$ and $J_m$ denotes a Jordan block of size $m$ for the eigenvalue $1$.
- Note that if $r\ge0$ and $m<n$, then $J_m^r$ coincides with the leading principal $m\times m$ submatrix of $J_n^r$. Hence $p(J_m^k)$ coincides with a leading principal submatrix of $p(J_n^k)$ for any polynomial $p$.
- It follows that if $p(J_n^k)=J_n$, then $p(J_m^k)=J_m$ for every $m<n$. This is true in particular for each $m\in\{m_1,m_2,\ldots,m_b\}$. Consequently, $p(A^k)=A$.
- Hence the problem boils down to finding a polynomial $p$ such that $p(J_n^k)=J_n$. This should be straightforward and I will leave it to you.
– user1551 (accepted; answered yesterday, edited 12 hours ago)
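The hint can be sanity-checked numerically. The sketch below (a minimal illustration; the size $n$, the exponent $k$, and the random seed are arbitrary choices, not from the answer) builds a unipotent $A$, sets $X=A^k-I$, and confirms that the binomial series $(I+X)^{1/k}$, which nilpotency truncates after $n$ terms, recovers $A$ as a polynomial in $A^k$:

```python
import numpy as np

n, k = 4, 3
# Strictly upper-triangular N satisfies N^n = 0, so A = I + N is unipotent.
N = np.triu(np.random.default_rng(0).standard_normal((n, n)), 1)
A = np.eye(n) + N
X = np.linalg.matrix_power(A, k) - np.eye(n)   # nilpotent: X^n = 0

def binom(r, i):
    """Generalised binomial coefficient C(r, i) for rational r."""
    out = 1.0
    for j in range(i):
        out *= (r - j) / (j + 1)
    return out

# (I + X)^(1/k) = sum_i C(1/k, i) X^i; the series stops at i = n - 1.
recovered = sum(binom(1.0 / k, i) * np.linalg.matrix_power(X, i)
                for i in range(n))
assert np.allclose(recovered, A)   # A is a polynomial in A^k
```

Since any matrix commuting with $A^k$ commutes with every polynomial in $A^k$, this is exactly why $A^kB=BA^k$ forces $AB=BA$.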
1
I was trying to prove your hint. I was thinking that if I can show that $A^k$ has a cyclic vector, then since $A^k$ commutes with $A$, $A$ would be a polynomial in $A^k$. But it may not be the case that $A^k$ has a cyclic vector. Can you kindly give me one more hint to prove your hint?
– Panja
yesterday
5
My favorite way of proving the hint is to recall Newton's binomial formula $\left(1+x\right)^r = \sum\limits_{i=0}^{\infty} \dbinom{r}{i} x^i$ for all $r \in \mathbb{Q}$. This is, per se, an equality between formal power series in the indeterminate $x$ over $\mathbb{Q}$. But since the matrix $A^k - I$ is nilpotent (make sure you understand why!), we can substitute $A^k - I$ for $x$ in this formula (make sure you understand why!), and apply it to $r = 1/k$, thus obtaining $A$ on the left hand side (make sure you understand how!), and on the right hand side a polynomial in $A^k - I$.
– darij grinberg
19 hours ago
1
I never knew you could put non-integers into binomial coefficients. Thanks for sharing!
– user25959
14 hours ago
1
@Song There is no need to do this. Since $X$ is nilpotent, the infinite sum is actually a finite sum (or more specifically, a sum of no more than $n$ terms).
– user1551
12 hours ago
1
@Song $A-I$ is nilpotent. Therefore $1$ is an eigenvalue of $A$ of multiplicity $n$. Hence $1$ is an eigenvalue of $A^k$ of multiplicity $n$, meaning that $A^k-I$ is nilpotent. You may also triangularise $A$ first to see this.
– user1551
12 hours ago
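The nilpotency fact in this thread can also be verified directly: $A^k-I=(A-I)(A^{k-1}+\cdots+A+I)$, and the two factors commute (both are polynomials in $A$), so $(A-I)^n=0$ forces $(A^k-I)^n=0$. A quick numeric check (sizes and seed are arbitrary choices):

```python
import numpy as np

n, k = 5, 4
# A = I + N with N strictly upper triangular, so (A - I)^n = 0.
N = np.triu(np.random.default_rng(1).standard_normal((n, n)), 1)
A = np.eye(n) + N
assert np.allclose(np.linalg.matrix_power(A - np.eye(n), n), 0)

# Then A^k - I is nilpotent as well, with the same bound on the index.
X = np.linalg.matrix_power(A, k) - np.eye(n)
assert np.allclose(np.linalg.matrix_power(X, n), 0)
```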
If $A$ has only $1$ as an eigenvalue then it is invertible (and of the form $I+N$, where $N$ is nilpotent with $N^n=0$).
$A$ also satisfies every equation $(A^i-I)^n=0$, so we can write
$(A-I)^n=0,\ (A^2-I)^n=0,\ \dots,\ (A^k-I)^n=0$.
Also we can write $A^k B(A^k)^{-1}=B$.
In similar fashion,
$A^k B(A^k)^{-1}=(A^k)^{-1} B A^k$,
$A^{2k} B = B A^{2k}$, etc.
For any natural $m$:
$A^{mk} B = B A^{mk}$.
If the powers $(A^k)^m$ could form a basis for expressing $A$, then commutativity would follow...
– Widawensen (answered yesterday, edited 23 hours ago)
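The induction step $A^{mk}B=BA^{mk}$ can be sanity-checked numerically. In the sketch below, $B$ is chosen as a polynomial in $A^k$ purely so that the hypothesis $A^kB=BA^k$ holds (an arbitrary illustrative choice, not the general case):

```python
import numpy as np

n, k = 4, 2
N = np.diag(np.ones(n - 1), 1)        # nilpotent shift matrix
A = np.eye(n) + N                     # single Jordan block, eigenvalue 1
Ak = np.linalg.matrix_power(A, k)

# A polynomial in A^k commutes with A^k, giving us the hypothesis.
B = 2 * np.eye(n) + 3 * Ak + Ak @ Ak
assert np.allclose(Ak @ B, B @ Ak)

# Conjugating repeatedly: A^k B (A^k)^{-1} = B implies A^{mk} B = B A^{mk}.
for m in range(1, 5):
    Amk = np.linalg.matrix_power(A, m * k)
    assert np.allclose(Amk @ B, B @ Amk)
```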