zeros of $x^*Ax$, a quadratic form
The question hopefully says it all!

We have a Hermitian matrix $A=A^* \in \mathbb{C}^{n\times n}$ and the quadratic form $f(x)=x^*Ax$, $x\in \mathbb{C}^n$. We want to find all solutions of $f(x) = x^*Ax = 0$.

When the matrix is positive semidefinite (p.s.d.), the solution set appears to be the null space of $A$; I found this by diagonalising $A$. But I am stuck as soon as $A$ also has negative eigenvalues.

Please note that when the matrix is not p.s.d., the solution set still contains the null space (i.e. the null space always consists of solutions).

linear-algebra matrices eigenvalues-eigenvectors quadratic-forms matrix-decomposition
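For the p.s.d. case, the diagonalisation argument can be spelled out in one line; this is a short sketch, writing $A=U\Lambda U^*$ with $U$ unitary and all $\lambda_i\ge 0$:
\begin{align}
x^*Ax=\sum_{i}\lambda_i\,\bigl|(U^*x)_i\bigr|^2=0
\;\Longleftrightarrow\;
(U^*x)_i=0\ \text{whenever}\ \lambda_i>0
\;\Longleftrightarrow\;
Ax=0,
\end{align}
so the zero set is exactly the null space of $A$. When $A$ has eigenvalues of both signs, positive and negative terms can cancel, and the zero set becomes strictly larger than the null space.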
asked Mar 20 '15 at 14:20 by Loves Probability, edited Mar 20 '15 at 14:34
Have you tried solving this in the case that $A$ is diagonal? – Omnomnomnom, Mar 20 '15 at 17:21

In fact, it suffices to solve this in the case that $A$ has entries $\pm 1$ and $0$. – Omnomnomnom, Mar 20 '15 at 17:23

@Omnomnomnom Thanks for the comment. That's exactly where my solution stands: this three-valued diagonal case gives me an equation with no further insight! – Loves Probability, Mar 20 '15 at 23:01

For example, the solution set for the matrix $$\pmatrix{1\\&1\\&&-1}$$ is the cone $x^2+y^2=z^2$. Note that this is not a linear subspace. – Omnomnomnom, Mar 21 '15 at 5:07
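This cone example is easy to check numerically; the sketch below (NumPy, with two arbitrary Pythagorean points as test data) also shows that the zero set is not closed under addition:

```python
# Check the cone example: for A = diag(1, 1, -1), points with x^2 + y^2 = z^2 are
# zeros of the quadratic form, but their sum generally is not (not a subspace).
import numpy as np

A = np.diag([1.0, 1.0, -1.0])

u = np.array([3.0, 4.0, 5.0])             # 3^2 + 4^2 = 5^2, so u lies on the cone
v = np.array([5.0, 12.0, 13.0])           # another point on the cone

print(u @ A @ u)                          # 0.0
print(v @ A @ v)                          # 0.0
print((u + v) @ A @ (u + v))              # -4.0: the sum is not a zero
```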
1 Answer
Consider $N\times N$ matrices and $N\times 1$ vectors. Let $A=U\Lambda U^H$ be the eigen-decomposition of $A$, with $U$ unitary. Writing $y=U^Hx$ (so that $x=Uy$),
\begin{align}
x^HAx &= y^H\Lambda y \\
&= \sum_{i=1}^{N}|y_i|^2\lambda_i \\
&= \theta^T\lambda,
\end{align}
where $\lambda$ is the vector of all eigenvalues and $\theta$ is the non-negative vector with entries $\theta_i=|y_i|^2$. Thus $x^HAx=0$ exactly when $\theta\in\mathcal{N}(\lambda)\cap\mathbb{R}_{+}^N$, where $\mathcal{N}(\lambda)$ is the set of all vectors orthogonal to $\lambda$ and $\mathbb{R}_{+}^N$ is the non-negative orthant (all entries non-negative), since then $\theta^T\lambda=0$. Now consider the set $\mathcal{D}$ of all diagonal matrices whose diagonal entries lie on the unit circle in the complex plane. Your solution set is then
\begin{align}
\mathcal{S}_x \;=\; \left\{\, U D\sqrt{\theta} \;\middle|\; D\in\mathcal{D},~ \theta\in\mathcal{N}(\lambda)\cap\mathbb{R}_{+}^N \,\right\},
\end{align}
where $\sqrt{\theta}$ is the entry-wise square root of $\theta$.

The set $\mathcal{D}$ is needed because the phase of each entry of $y$ does not matter: $y_{i}=D_{ii}\sqrt{\theta_i}$ and $|y_i|^2=|D_{ii}|^2\theta_i=\theta_i$. The difficult part is that $\mathcal{S}_x$ is not a linear subspace, so you lose the nice properties that come with one. The parametrization is highly non-linear, and I am not sure what intuitive sense you can derive out of it.
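As a sanity check, the parametrization can be tried numerically. The sketch below assumes an arbitrary indefinite Hermitian test matrix, builds one admissible $\theta$ by pairing a positive and a negative eigenvalue, and verifies that $x=UD\sqrt{\theta}$ gives $x^*Ax\approx 0$:

```python
# Sketch: construct one element of S_x = { U D sqrt(theta) } for an indefinite
# Hermitian A and verify that it is a zero of the quadratic form x* A x.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative test matrix: fixed indefinite spectrum, random unitary eigenbasis.
lam_true = np.array([2.0, 1.0, 0.0, -3.0])
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
A = (Q * lam_true) @ Q.conj().T           # A = Q diag(lam_true) Q^H, Hermitian

lam, U = np.linalg.eigh(A)                # A = U diag(lam) U^H, lam sorted ascending

# One theta >= 0 with theta . lam = 0: pair a positive and a negative eigenvalue.
p, n = int(np.argmax(lam)), int(np.argmin(lam))
theta = np.zeros_like(lam)
theta[p], theta[n] = -lam[n], lam[p]      # theta[p]*lam[p] + theta[n]*lam[n] = 0

# Arbitrary unit-modulus phases for the diagonal of D.
phases = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=lam.size))

x = U @ (phases * np.sqrt(theta))         # an element of S_x
print(abs(x.conj() @ A @ x))              # ~1e-15: x is a zero of the quadratic form
```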
answered Mar 23 '15 at 13:53 by dineshdileep, edited Mar 24 '15 at 5:14
Thanks for the solution; this does give good insight. But $\mathcal{N}(\lambda)\cap\mathbb{R}_{+}^{N}$ is not the solution set for $y$ (it is the solution set for the entry-wise products $y\odot y^*$, i.e. for the $|y_i|^2$), let alone for the desired $x$! (The question actually asks about $x$, the zeros of the quadratic $x^*Ax$.) Of course, $y$ is obtained from $\mathcal{N}(\lambda)\cap\mathbb{R}_{+}^{N}$ by a simple but non-linear mapping, and then another unitary transformation has to be applied after this non-linear transform. What all of that finally means is still the mystery! – Loves Probability, Mar 23 '15 at 15:22

By the way, I hope you remember me, Dileep! (I am Harish, who joined the ME (Telecom) programme along with you at IISc; I did my project under Prof. VS, 2009-11 batch.) – Loves Probability, Mar 23 '15 at 15:24

You are right, it stopped short of stating the answer. I thought the reverse transformation was obvious; I have now stated it explicitly. – dineshdileep, Mar 24 '15 at 5:17

Yes, I do know you, Harish. We were neighbours in the hostel as well :) :) – dineshdileep, Mar 24 '15 at 5:18