Regression with predictors that are sometimes undefined
I have a model M that takes input X and outputs Y, and I'm trying to predict how the mean of Y changes as the model's parameters P vary, given a distribution for X and some sample means of Y from runs of the model at chosen parameter values. Say P=(A,B). M is a complex model: I can only run it at particular parameter values and observe the result. Normally I would simply regress the sample means of Y onto the values of (A,B). However, the model also has a logical parameter L that can be true or false, and when L is true there is an additional continuous parameter C; hence C is undefined when L=false. In other words:
Y = M(X; A,B) when L=false
Y = M(X; A,B,C) when L=true
I would like to find the dependence of the mean of Y on A, B, C and L, and it is reasonable to assume that the value of L does not change the dependence of Y on A or B. Running separate regressions for the points with L=false and L=true seems sub-optimal: the estimated dependence on A and B would then generally differ between the two subsets, even though it is assumed to be the same.
I thought about regressing onto L and L*C (coding L=false as L=0, and defining L*C as 0 in those cases, so that the mean of Y correctly has no dependence on C when L=false). However, I don't think this will generally give the correct dependence on C when L=true, since the points with L*C=0 will still affect that estimate.
Does anyone have a good idea for what to do?
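For reference, the encoding I'm describing would look something like this (a minimal Python sketch; all values are placeholders rather than real model runs):

```python
import numpy as np

# Hypothetical data, one row per model run (placeholder values).
A = np.array([0.2, 0.5, 0.8, 0.3])
B = np.array([1.0, 0.4, 0.9, 0.7])
L = np.array([0, 1, 1, 0])                # false -> 0, true -> 1
C = np.array([np.nan, 2.0, 3.5, np.nan])  # undefined when L = 0

# Set the interaction column L*C to 0 wherever L is false, so those
# runs contribute nothing to the estimated dependence on C.
LC = np.where(L == 1, C, 0.0)

# Design matrix for regressing the sample means of Y onto A, B, L, L*C.
X = np.column_stack([A, B, L, LC])
```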
regression
asked Jan 25 at 16:46
PeterW
1 Answer
Actually I take back my statement "I don't think [regressing onto L and L*C] will generally give the correct dependence on C when L=true". If the regression model is
$$ Y = \alpha A + \beta B + \gamma L + \delta L C, $$
then, when $L=\mathrm{true}$ and $C=0$, the mean of $Y$ is $\alpha A + \beta B + \gamma$, so for given $A$ and $B$ the line of $Y$ plotted against $C$ at $L=\mathrm{true}$ has intercept $\alpha A + \beta B + \gamma$, with the $\gamma L$ term absorbing the shift due to $L$. Since the model is then correctly specified, the estimate of $\delta$ from a standard ordinary least squares regression should be unbiased (provided the other assumptions made when doing a regression are satisfied, of course): the points with $LC=0$ help pin down $\alpha$ and $\beta$, but they carry no information about $C$ and so do not distort the estimated dependence on $C$.
So I think regressing against $L$ and $LC$ is actually the correct way to do this. This seems to work in simple numerical tests I've done. Do please correct me if I'm wrong.
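For what it's worth, a test along those lines might look like the following (a minimal sketch: the coefficient values, sample size, and noise level are arbitrary choices for illustration, not anything from the actual model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Simulated parameter settings; C is drawn only where L is true,
# so the interaction L*C is 0 elsewhere.
A = rng.normal(size=n)
B = rng.normal(size=n)
L = rng.integers(0, 2, size=n).astype(float)   # 0 = false, 1 = true
C = np.where(L == 1, rng.normal(size=n), 0.0)

# Generate Y from the assumed model with known (arbitrary) coefficients.
alpha, beta, gamma, delta = 2.0, -1.0, 0.5, 3.0
Y = alpha * A + beta * B + gamma * L + delta * L * C + 0.1 * rng.normal(size=n)

# Ordinary least squares on [A, B, L, L*C].
X = np.column_stack([A, B, L, L * C])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(coef)   # should recover approximately [2.0, -1.0, 0.5, 3.0]
```

In tests of this form the recovered coefficients match the true ones to within sampling error, which is what the unbiasedness argument above predicts.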
answered Jan 25 at 17:23
PeterW