Question regarding norms of Cauchy-Schwarz inequality
I am trying to solve problems related to the Cauchy–Schwarz inequality, but I can't seem to understand why, after computing the inner product on the left side, we don't take the square root of it.
The Cauchy–Schwarz inequality is
$$\lvert \langle u,v\rangle\rvert \leq \lVert u\rVert\, \lVert v\rVert.$$
To find the norm of $u$, we first square the components of $u$, then add them, and finally take the square root of the result; same for the norm of $v$. But on the left side, we just compute the inner product. Doesn't a norm imply that the square root of the resulting value must be taken?
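As a quick numerical sanity check of the two sides (a sketch using NumPy; the vectors are arbitrary examples, not from the question):

```python
import numpy as np

# Arbitrary example vectors in R^3.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -5.0, 6.0])

# LHS: absolute value of the inner product -- no square root here.
lhs = abs(np.dot(u, v))

# RHS: product of the two norms -- each norm takes a square root internally.
rhs = np.linalg.norm(u) * np.linalg.norm(v)

print(lhs, rhs)  # lhs <= rhs, as Cauchy-Schwarz guarantees
```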
linear-algebra inner-product-space cauchy-schwarz-inequality
On the left side the bars just denote absolute value, not norm. – Ian, Jan 28 at 17:49
edited Jan 28 at 19:58 by J. W. Tanner
asked Jan 28 at 17:48 by JohnySmith12
1 Answer
One way to remember and see this is that things need to stay homogeneous (yes, as in physics).
Informal explanation: if $u$ and $v$ were in, say, meters, then the inner product $\langle u,v\rangle$ is a product, so it is in meters squared; on the RHS, both norms are in meters, so the product of the two norms is also in meters squared.
More formally, and more to the point, the inequality must remain true if you replace $u$ by $\alpha u$, for any number $\alpha$; after all, $\alpha u$ is just another vector. The same goes for replacing $v$ by $\beta v$. So we need
$$
\lvert \langle \alpha u,\beta v\rangle\rvert \leq \lVert \alpha u \rVert\cdot \lVert \beta v \rVert \qquad \forall \alpha,\beta \in\mathbb{R}. \tag{1}
$$
But by (bi)linearity, the LHS equals $\lvert \langle \alpha u,\beta v\rangle\rvert = \lvert \alpha\beta \langle u, v\rangle\rvert = \lvert \alpha\beta \rvert \cdot \lvert \langle u, v\rangle\rvert$, while the RHS equals (by properties of norms) $\lVert \alpha u \rVert\cdot \lVert \beta v \rVert = \lvert \alpha\beta \rvert \cdot \lVert u \rVert\cdot \lVert v \rVert$. That's good: the factors $\lvert \alpha\beta \rvert$ cancel on both sides of (1).
If you had a square root on the LHS, they would not cancel, and (1) couldn't be true for all $\alpha,\beta$.
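The cancellation argument can be checked numerically. This sketch (NumPy, with arbitrarily chosen vectors and scalars, not taken from the answer) verifies that the ratio LHS/RHS is unchanged under scaling, while putting a square root on the LHS breaks that invariance:

```python
import numpy as np

# Arbitrary example vectors and scalars.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -5.0, 6.0])
alpha, beta = -2.5, 3.0

def lhs(x, y):
    return abs(np.dot(x, y))                       # |<x, y>|

def rhs(x, y):
    return np.linalg.norm(x) * np.linalg.norm(y)   # ||x|| ||y||

# Both sides pick up the same factor |alpha * beta| under scaling,
# so the ratio LHS/RHS is scale-invariant.
base_ratio = lhs(u, v) / rhs(u, v)
scaled_ratio = lhs(alpha * u, beta * v) / rhs(alpha * u, beta * v)
print(np.isclose(base_ratio, scaled_ratio))  # True

# With a square root on the LHS, the ratio changes with the scaling.
bad_ratio = np.sqrt(lhs(alpha * u, beta * v)) / rhs(alpha * u, beta * v)
print(np.isclose(np.sqrt(base_ratio), bad_ratio))  # False here
```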
edited Jan 28 at 19:13 by J. W. Tanner
answered Jan 28 at 17:57 by Clement C.