If $A^k$ commutes with $B$ then $A$ commutes with $B$.























Let $A$ and $B$ be two $n \times n$ complex matrices. Assume that $(A-I)^n=0$ and $A^kB=BA^k$ for some $k \in \mathbb{N}$. I want to prove that $AB=BA$.



Clearly $1$ is the only eigenvalue of $A$, and $A^k$ and $B$ are simultaneously triangularizable. But how do I get from there to $A$ commuting with $B$? Any help will be appreciated. Thanks.
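A quick numerical sanity check of the statement (my addition, not part of the original question; it assumes `numpy` and `scipy` are available): build a unipotent $A$, compute a basis of all $B$ satisfying $A^kB=BA^k$ by solving that linear condition, and verify that a random such $B$ also commutes with $A$.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
n, k = 4, 3

# Unipotent A: A = I + N with N strictly upper triangular,
# so (A - I)^n = 0 and 1 is the only eigenvalue of A.
A = np.eye(n) + np.triu(rng.standard_normal((n, n)), 1)
Ak = np.linalg.matrix_power(A, k)

# With row-major vectorisation, vec(P @ B @ Q) = np.kron(P, Q.T) @ vec(B),
# so the condition A^k B - B A^k = 0 becomes M @ vec(B) = 0 with:
M = np.kron(Ak, np.eye(n)) - np.kron(np.eye(n), Ak.T)

# Pick a random B commuting with A^k (a random element of the null space of M).
basis = null_space(M)
B = (basis @ rng.standard_normal(basis.shape[1])).reshape(n, n)

print(np.allclose(Ak @ B, B @ Ak))   # True by construction
print(np.allclose(A @ B, B @ A))     # True -- the claim in question
```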

















































































linear-algebra abstract-algebra matrices field-theory
















asked yesterday









Panja









  • 2




    Note that $\mathbb{N}$ has to mean $\left\{1,2,3,\ldots\right\}$ for this to be correct.
    – darij grinberg
    19 hours ago
























2 Answers



























Hint. Try to prove that $A$ is a polynomial in $A^k$.



Edit. To prove the hint, you may follow darij grinberg's comment below. I did essentially the same thing, but from a matrix-analytic rather than linear-algebraic perspective: I considered the primary matrix function $f(X)=(I+X)^{1/k}=\sum_{i=0}^\infty \frac{f^{(i)}(0)}{i!}X^i$, associated with the scalar function $f(x)=(1+x)^{1/k}$, for a nilpotent matrix $X$. Put $X=A^k-I$ and we are done.
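A minimal numerical sketch of this construction (my addition, not part of the original answer; it assumes `numpy`): compute the generalized binomial coefficients $\binom{1/k}{i}$ directly and check that the truncated series in the nilpotent matrix $A^k-I$ reproduces $A$.

```python
import numpy as np

def gen_binom(r, i):
    # Generalized binomial coefficient r(r-1)...(r-i+1) / i!
    out = 1.0
    for j in range(i):
        out *= (r - j) / (j + 1)
    return out

rng = np.random.default_rng(1)
n, k = 5, 3

# Unipotent A, so that X = A^k - I is nilpotent with X^n = 0.
A = np.eye(n) + np.triu(rng.standard_normal((n, n)), 1)
X = np.linalg.matrix_power(A, k) - np.eye(n)

# Truncated binomial series (I + X)^(1/k): only the first n terms are nonzero.
S = sum(gen_binom(1.0 / k, i) * np.linalg.matrix_power(X, i) for i in range(n))

print(np.allclose(S, A))   # True: A is a polynomial in A^k
```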



Alternatively, the hint can be proved using only Jordan forms, but the argument is much longer.




  1. Let $J$ be the Jordan form of $A$. Since all eigenvalues of $A$ are equal to $1$, we may write $J=J_{m_1}\oplus J_{m_2}\oplus\cdots\oplus J_{m_b}$, where $1\le m_1\le m_2\le\cdots\le m_b$ and $J_m$ denotes a Jordan block of size $m$ for the eigenvalue $1$.

  2. Note that if $r\ge0$ and $m<n$, then $J_m^r$ coincides with the leading principal $m\times m$ submatrix of $J_n^r$. Hence $p(J_m^k)$ coincides with a leading principal submatrix of $p(J_n^k)$ for any polynomial $p$.

  3. It follows that if $p(J_n^k)=J_n$, then $p(J_m^k)=J_m$ for every $m<n$. This is true in particular when $m\in\{m_1,m_2,\ldots,m_b\}$. Consequently, $p(A^k)=A$.

  4. Hence the problem boils down to finding a polynomial $p$ such that $p(J_n^k)=J_n$. This should be straightforward and I will leave it to you.
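For illustration (my addition, not part of the original answer), the smallest nontrivial case is $n=2$: writing $J_2=I+N$ with $N^2=0$, we have $J_2^k=(I+N)^k=I+kN$, so

$$p(x)=1+\frac{x-1}{k} \quad\text{satisfies}\quad p(J_2^k)=I+N=J_2.$$

For general $n$ one can take the truncated binomial series $p(x)=\sum_{i=0}^{n-1}\binom{1/k}{i}(x-1)^i$, which terminates because $(J_n^k-I)^n=0$.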






answered yesterday, edited 12 hours ago
user1551



















  • 1




    I was trying to prove your hint. I was thinking that if I could show that $A^k$ has a cyclic vector, then since $A^k$ commutes with $A$, $A$ would be a polynomial in $A^k$. But it may not be the case that $A^k$ has a cyclic vector. Can you kindly give me one more hint to prove your hint?
    – Panja
    yesterday








  • 5




    My favorite way of proving the hint is to recall Newton's binomial formula $\left(1+x\right)^r = \sum\limits_{i=0}^{\infty} \dbinom{r}{i} x^i$ for all $r \in \mathbb{Q}$. This is, per se, an equality between formal power series in the indeterminate $x$ over $\mathbb{Q}$. But since the matrix $A^k - I$ is nilpotent (make sure you understand why!), we can substitute $A^k - I$ for $x$ in this formula (make sure you understand why!), and apply it to $r = 1/k$, thus obtaining $A$ on the left hand side (make sure you understand how!), and on the right hand side a polynomial in $A^k - I$.
    – darij grinberg
    19 hours ago






  • 1




    I never knew you could put non-integers into binomial coefficients. Thanks for sharing!
    – user25959
    14 hours ago






  • 1




    @Song There is no need to do this. Since $X$ is nilpotent, the infinite sum is actually a finite sum (or more specifically, a sum of no more than $n$ terms).
    – user1551
    12 hours ago






  • 1




    @Song $A-I$ is nilpotent. Therefore $1$ is an eigenvalue of $A$ of multiplicity $n$. Hence $1$ is an eigenvalue of $A^k$ of multiplicity $n$, meaning that $A^k-I$ is nilpotent. You may also triangularise $A$ first to see this.
    – user1551
    12 hours ago

































If $A$ has only $1$ as an eigenvalue, then it is invertible (and of the form $I+N$, where $N$ is nilpotent with $N^n=0$).



$A$ also satisfies $(A^i-I)^n=0$ for every natural $i$, since $A^i-I=(A-I)(A^{i-1}+\cdots+A+I)$ and the two factors commute, so we can write



$(A-I)^n=0,\quad (A^2-I)^n=0,\quad \ldots,\quad (A^k-I)^n=0$.



Also, since $A^k$ is invertible, we can write $A^kB(A^k)^{-1}=B$.

In a similar fashion,



$A^kB(A^k)^{-1}=(A^k)^{-1}BA^k,$

so that $A^{2k}B = BA^{2k}$, etc.



For any natural number $m$:

$A^{mk}B = BA^{mk}.$



If the powers $(A^k)^m$ could serve as a basis for expressing $A$, then commutativity would follow...
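Indeed, combining this with the accepted answer's hint fills the remaining gap (this remark is my addition, not part of the original answer): since $B$ commutes with every power $A^{mk}=(A^k)^m$, it commutes with any polynomial in $A^k$, and the accepted answer shows $A=p(A^k)$ for some polynomial $p$, so

$$AB=p(A^k)\,B=B\,p(A^k)=BA.$$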






answered yesterday, edited 23 hours ago
Widawensen






















