Does minimizing the sum of absolute differences of each point from the average also minimize the variance?
I have a problem in which there is a set of $N$ points. Each point $i$ has a pre-decided set of possible weights, and one weight $W_i$ must be selected from that set, where $i \in [1,N]$. My goal is to select the weight $W_i$ for each point $i$ so that the variance of the selected weights is minimized.
If I find the selection of weights $W_i$ that minimizes the function $f=\sum_{i=1}^{N} \lvert W_i - \mu \rvert$, where $\mu$ is the mean of the $W_i$, does that imply that the variance of the $W_i$ is also minimized?
Is there a way to prove this formally?
Sorry for my terminology and my English.
Any help will be greatly appreciated :)
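For concreteness, here is a minimal brute-force sketch in Python; the candidate sets in it are invented purely for illustration and are not part of the original problem. It enumerates every feasible selection and compares the selection that minimizes $f$ with the selection that minimizes the variance.

```python
# Illustrative brute-force check. The candidate sets below are made up for this
# sketch: four points are fixed and the fifth may take either 1.5 or 3.
from itertools import product
from statistics import mean, pvariance

candidates = [[0], [0], [0], [10], [1.5, 3]]

def f(weights):
    """Sum of absolute deviations of the weights from their mean."""
    mu = mean(weights)
    return sum(abs(w - mu) for w in weights)

selections = list(product(*candidates))   # every feasible choice of one weight per point
best_f = min(selections, key=f)           # minimizes the sum of absolute deviations
best_var = min(selections, key=pvariance) # minimizes the (population) variance

print("f-minimizer:  ", best_f, "f =", round(f(best_f), 2), "Var =", round(pvariance(best_f), 2))
print("Var-minimizer:", best_var, "f =", round(f(best_var), 2), "Var =", round(pvariance(best_var), 2))
```

With these sets the $f$-minimizer takes $1.5$ for the last point ($f = 15.4$, variance $15.16$) while the variance-minimizer takes $3$ ($f = 15.6$, variance $15.04$), so at least in this toy instance minimizing $f$ does not minimize the variance.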
statistics variance
asked Jan 10 at 11:14 by Dimitris Stathis
I'm not sure what's preventing you from simply setting $W_i=1\ \forall i$. This gives variance $0$, and your $f$ is also $0$. But in general, the variance (which uses the squared differences) is not minimized at the same time as your function (which uses just the absolute values).
– Ingix
Jan 10 at 11:40
The set of values that each $W_i$ can take is pre-decided and cannot change, and each $W_i$ has a different set of possible values. My simple thinking was that, since the variance is the average of the squared deviations, minimizing the sum of the absolute deviations would also minimize the sum of the squared deviations, and hence the variance. I am not an expert in statistics, though, and I am afraid I might be missing something.
– Dimitris Stathis
Jan 11 at 16:10
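A quick numerical check, with numbers invented only for illustration, shows why that last step does not hold in general: compare the selections $(44,12,12,12)$ and $(33,33,7,7)$, both with mean $20$. The first has the smaller sum of absolute deviations ($24+8+8+8=48$ versus $13+13+13+13=52$) but the larger sum of squared deviations ($576+64+64+64=768$ versus $4\cdot 169=676$), hence the larger variance ($192$ versus $169$). So minimizing the sum of absolute deviations does not automatically minimize the sum of squares or the variance.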