The Lie bracket of $\mathfrak{gl}_n(\mathbb{R})$ is the matrix commutator
Notation/preliminaries.
Let $\mathfrak{g}$ denote the Lie algebra (of left-invariant vector fields) on the Lie group $G$. Its Lie bracket $[.,.]\colon \mathfrak{g}\times\mathfrak{g}\to\mathfrak{g}$ is defined by
$$[X,Y]_p(f)=X_p(Y_{\square}(f))-Y_p(X_\square(f))\tag{1}$$
for any vector fields $X,Y\in C^{\infty}(TG)$, any point $p\in G$ and any smooth function $f\in C^{\infty}(G)$. Here, $X_\square(f)\colon G\to\mathbb{R}$ is the smooth map defined by $q\mapsto X_q(f)$. Let $T_eG$ denote the tangent space at the identity element $e$, consisting of all linear maps $C^{\infty}(G)\to \mathbb{R}$ which satisfy the product rule. The tangent space is equipped with the Lie bracket $[\![.,.]\!]\colon T_eG\times T_eG\to T_eG$ given by $[\![X_e,Y_e]\!]=[X,Y]_e$.
This gives us a Lie algebra isomorphism $\mathfrak{g}\cong T_eG$. More precisely, one can show that every tangent vector $X_e\in T_eG$ can be extended in a unique way to a left-invariant vector field $X\in \mathfrak{g}$.
For the Lie group $G=\mathrm{GL}_n(\mathbb{R})$, we can use $x\colon \mathrm{GL}_n(\mathbb{R})\to \mathbb{R}^{n\times n}$ defined by $p\mapsto p$ as global coordinates.
We will use $\Big\{\Big(\frac{\partial}{\partial x^{ij}}\Big)_e\Big\}_{i,j=1}^n$ as a basis for $T_e\mathrm{GL}_n(\mathbb{R})$. Here, $\Big(\frac{\partial}{\partial x^{ij}}\Big)_e(f)=\partial_{ij}(f\circ x^{-1})\vert_{x(e)}$ for every smooth function $f\colon \mathrm{GL}_n(\mathbb{R})\to\mathbb{R}$.
This gives rise to a vector space isomorphism $T_e\mathrm{GL}_n(\mathbb{R})\cong \mathbb{R}^{n\times n}$ via
$\sum_{i,j} a_{ij}\Big(\frac{\partial}{\partial x^{ij}}\Big)_e\mapsto (a_{ij})$. For any left-invariant vector field $X$ on $\mathrm{GL}_n(\mathbb{R})$, we let $M_X$ denote the matrix associated to $X$ via the identifications $\mathfrak{gl}_n(\mathbb{R})\cong T_e\mathrm{GL}_n(\mathbb{R})\cong \mathbb{R}^{n\times n}$.
Problem. I want to show that under the identifications $\mathfrak{gl}_n(\mathbb{R})\cong T_e\mathrm{GL}_n(\mathbb{R})\cong \mathbb{R}^{n\times n}$, the bracket $[.,.]$ corresponds to the matrix commutator on $\mathbb{R}^{n\times n}$. Or more precisely:
For any left-invariant vector fields $X$ and $Y$ on $\mathrm{GL}_n(\mathbb{R})$, it holds that
$M_{[X,Y]}=M_XM_Y-M_YM_X$.
Own attempt. I have realized that it suffices to show that for two tangent vectors $X_e=\sum_{i,j} a_{ij} \Big(\frac{\partial}{\partial x^{ij}}\Big)_e$ and $Y_e=\sum_{i,j} b_{ij} \Big(\frac{\partial}{\partial x^{ij}}\Big)_e$, it holds that
$$[X,Y]_e=\sum_{i,j,k} \big(a_{ik}b_{kj}-b_{ik}a_{kj}\big)\Big(\frac{\partial}{\partial x^{ij}}\Big)_e.$$
The first step, I guess, is to find the left-invariant extensions $X$ and $Y$ of $X_e$ and $Y_e$, respectively. I'm more or less convinced that for any $p=(p_{ij})\in \mathrm{GL}_n(\mathbb{R})$, it holds that
$$X_p=\sum_{i,j,k} p_{ik}a_{kj}\Big(\frac{\partial}{\partial x^{ij}}\Big)_p.$$ Can we conclude from this that
$X=\sum_{i,j,k} x^{ik}(\square)\,a_{kj}\Big(\frac{\partial}{\partial x^{ij}}\Big)_\square$
holds?
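For reference, here is the chain-rule computation that I believe justifies the formula for $X_p$ (a sketch, using only left invariance and the global coordinates $x$ above): left invariance means $X_p=(dL_p)_eX_e$, where $L_p$ denotes left translation by $p$, and since $x^{ij}\circ L_p=\sum_k p_{ik}\,x^{kj}$, the chain rule gives
$$(dL_p)_e\Big(\frac{\partial}{\partial x^{kj}}\Big)_e=\sum_{i} p_{ik}\Big(\frac{\partial}{\partial x^{ij}}\Big)_p,$$
and applying this to $X_e=\sum_{k,j}a_{kj}\Big(\frac{\partial}{\partial x^{kj}}\Big)_e$ yields exactly the expression for $X_p$ displayed above.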
I've then tried to compute $[X,Y]_e(f)$ for an arbitrary $f\in C^\infty(\mathrm{GL}_n(\mathbb{R}))$, using the formula in (1) above, and end up with the scary expression
$$[X,Y]_e(f)=\Bigg(\sum_{i,j} a_{ij} \Big(\frac{\partial}{\partial x^{ij}}\Big)_e\Bigg)\Bigg(\Big(\sum_{i,j,k} x^{ik}(\square)\,b_{kj}\Big(\frac{\partial}{\partial x^{ij}}\Big)_\square\Big)(f)\Bigg)-\Bigg(\sum_{i,j} b_{ij} \Big(\frac{\partial}{\partial x^{ij}}\Big)_e\Bigg)\Bigg(\Big(\sum_{i,j,k} x^{ik}(\square)\,a_{kj}\Big(\frac{\partial}{\partial x^{ij}}\Big)_\square\Big)(f)\Bigg)\,,$$
from which I have no idea where to go. Am I at all on the right track here? It feels like my main problem is that I get a little bit lost in all the notation and all the identifications we make back and forth. Indeed, proofs of this fact can be found in many textbooks (e.g. Lee's Introduction to Smooth Manifolds, p. 194), but the notation there tends to be too coarse for me to follow what is going on.
differential-geometry lie-groups lie-algebras smooth-manifolds
asked Jan 28 at 6:39 by Oskar Henriksson (edited Feb 1 at 2:30)
1 Answer
The "scary expression" will give you what you want, but you need to be careful with the names of the indices; you have some conflicts there, like $i$ appearing twice in expressions that are not summed over.
Instead, let's streamline things a bit. For $A,B\in \mathfrak g$ we have $$ A=a^{ij}\partial_{ij}|_e,\qquad B=b^{ij}\partial_{ij}|_e $$ with the components being constants, and summation over repeated indices is understood. A generic group element is denoted by $x^{ij}$, and $\partial_{ij}$ are the holonomic frame vectors associated with the canonical coordinate system mapping group elements to their matrix elements.
We also recall that given any manifold $M$ with a local chart $(U,\varphi)$ (with $\varphi(x)=(x^1(x),\dots,x^n(x))$), if two vector fields $X=X^i\partial_i$ and $Y=Y^i\partial_i$ are given, then their commutator is locally given by $$ [X,Y]=\left(X^j\partial_j Y^i-Y^j\partial_j X^i\right)\partial_i. $$
Left multiplication:
Let $\gamma\colon(-\epsilon,\epsilon)\rightarrow G$ be a smooth curve such that $\gamma(0)=x$ and $\dot\gamma(0)=X=X^{ij}\partial_{ij}|_x$. Let $g=(g^{ij})\in G$ be a group element. Then $$ (l_g)_\ast X=\frac{d}{dt}g\gamma(t)\Big|_{t=0}\overset{!}{=}gX=g^{ik}X^{kj}\partial_{ij}|_{gx}, $$ where at the equality sign with the exclamation mark we use the fact that the group elements are just ordinary matrices embedded into $\mathbb R^{n\times n}$.
So left translation of vectors in $\text{GL}(n,\mathbb R)$ is just ordinary left multiplication of the vector (which is a matrix, remember!).
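Concretely, since $\text{GL}(n,\mathbb R)$ is an open subset of $\mathbb R^{n\times n}$, one admissible choice of curve (a sketch of the computation behind the pushforward above) is the straight line
$$ \gamma(t)=x+tX,\qquad g\gamma(t)=gx+t\,gX,\qquad \frac{d}{dt}\,g\gamma(t)\Big|_{t=0}=gX, $$
which in components reads $(gX)^{ij}=g^{ik}X^{kj}$, as claimed.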
The derivation:
By the previous, the left-invariant vector fields corresponding to $A,B\in\mathfrak g$ (also denoted the same way) are given by $$ A_x=x^{ik}a^{kj}\partial_{ij}|_x,\qquad B_x=x^{ik}b^{kj}\partial_{ij}|_x. $$
If we now reinterpret the $x^{ij}$ from being specific variables to being coordinate functions, we can also write the vector fields without evaluation at a specific point as $$ A=x^{ik}a^{kj}\partial_{ij},\qquad B=x^{ik}b^{kj}\partial_{ij}. $$ The commutator is then $$ [A,B]=(A^{mn}\partial_{mn}B^{ij}-B^{mn}\partial_{mn}A^{ij})\partial_{ij}, $$ where $$ A^{ij}=x^{ik}a^{kj}, $$ and similarly for $B$. This is $$ [A,B]=\left( x^{mr}a^{rn}\partial_{mn}(x^{ik}b^{kj})-x^{mr}b^{rn}\partial_{mn}(x^{ik}a^{kj}) \right)\partial_{ij}\overset{!}{=}\left(x^{mr}a^{rn}\delta^i_m\delta^k_n b^{kj}-x^{mr}b^{rn}\delta^i_m\delta^k_n a^{kj}\right)\partial_{ij} \\ =\left( x^{ir}a^{rk}b^{kj}-x^{ir}b^{rk}a^{kj} \right)\partial_{ij}. $$ At the equality with the exclamation mark we have used that $\partial_{mn}x^{ij}=\delta^i_m\delta^j_n$ and that the coefficients $a^{ij},b^{ij}$ are constants.
For the identity element we have $x^{ij}(e)=\delta^{ij}$, so $$ [A,B]_e=\left(a^{ik}b^{kj}-b^{ik}a^{kj}\right)\partial_{ij}|_e=[A_e,B_e], $$ where the last expression is the ordinary matrix commutator of the matrices $A_e,B_e$.
answered Jan 28 at 15:06 by Bence Racskó
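As a sanity check on the final identity, here is a minimal numerical sketch (assuming only numpy; the function name is made up for illustration). It builds the left-invariant coefficient functions $A^{ij}(x)=x^{ik}a^{kj}$ and $B^{ij}(x)=x^{ik}b^{kj}$, evaluates the local commutator formula at the identity by finite differences, and compares the result with the matrix commutator $ab-ba$.

import numpy as np

def bracket_at_identity(a, b, h=1e-6):
    # Coefficients of [A, B] at the identity, where A, B are the left-invariant
    # fields with coefficient matrices A(x) = x a and B(x) = x b, computed via
    # the local formula [A,B]^{ij} = A^{mk} d_{mk} B^{ij} - B^{mk} d_{mk} A^{ij}.
    n = a.shape[0]
    e = np.eye(n)
    A = lambda x: x @ a   # coefficient matrix of the field A at the point x
    B = lambda x: x @ b
    out = np.zeros((n, n))
    for m in range(n):
        for k in range(n):
            E = np.zeros((n, n))
            E[m, k] = 1.0
            dA = (A(e + h * E) - A(e - h * E)) / (2 * h)   # d_{mk} A^{ij} at e
            dB = (B(e + h * E) - B(e - h * E)) / (2 * h)   # d_{mk} B^{ij} at e
            out += A(e)[m, k] * dB - B(e)[m, k] * dA
    return out

rng = np.random.default_rng(0)
a = rng.standard_normal((3, 3))
b = rng.standard_normal((3, 3))
print(np.allclose(bracket_at_identity(a, b), a @ b - b @ a, atol=1e-4))  # prints True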
Thanks a lot! I'm a bit confused by the equality $[X,Y]=(X^j\partial_jY^i-Y^j\partial_jX^i)\partial_i$ though. I'm able to obtain $[X,Y]=X^j\partial_j(Y^i\partial_i)-Y^i\partial_i(X^j\partial_j)$, but I can't seem to get any further.
– Oskar Henriksson
Jan 28 at 20:44
Or wait, is this just the product rule? If I'm not mistaken, we get $$[X,Y]=X^j\partial_j (Y^i)\partial_i+\color{blue}{X^jY^i\partial_j\partial_i}-Y^i\partial_i (X^j)\partial_j-\color{blue}{Y^iX^j\partial_i\partial_j}$$ when we expand my expression. But for smooth functions $f\in C^{\infty}(U)$, it holds that $\partial_i\partial_jf=\partial_j\partial_if$. So the terms in blue cancel out, and we end up with $$[X,Y]=X^j\partial_j (Y^i)\partial_i-Y^i\partial_i (X^j)\partial_j=X^j\partial_j (Y^i)\partial_i-Y^j\partial_j (X^i)\partial_i\,,$$ which is equal to your expression. Correct?
– Oskar Henriksson
Jan 28 at 20:53
@OskarHenriksson Yes, basically.
– Bence Racskó
Jan 28 at 21:22