Linearization of a minimum function
I need some advice on a linear programming formulation. I have spent some time on this problem and have partially reached my goal, but gaps in my knowledge are blocking further progress.
Namely, I need to optimize the linear function
$f(x_1, x_2, \ldots, x_n) = c_1 x_1 + c_2 x_2 + \ldots + c_n x_n$, where $x_1, x_2, \ldots, x_n \in [0,1]$.
As you can see, it is an ordinary linear function. The non-trivial requirement is that some optimal solutions are preferred over others. For example, if the result of the optimization is
$x_1 = 0,\ x_2 = 0.2,\ x_3 = 0.8,\ x_4 = 0.1$,
then this solution is equivalent (in objective value) to the more preferred solution
$x_1 = 1,\ x_2 = 0.1,\ x_3 = 0,\ x_4 = 0$.
More precisely, I prefer solutions of the following form.
Let $s = \sum_{i=1}^{n} x_i$ (the sum of all $x_i$), so $s \in [0, n]$; let $k = \lfloor s \rfloor$ and $r = s - \lfloor s \rfloor$ (the fractional part of $s$). Then
$$
x_i = \begin{cases} 1, & i \le k, \\ r, & i = k + 1, \\ 0, & \text{otherwise.} \end{cases}
$$
Examples:
If $s=2.1$ then $x_1=1,\ x_2=1,\ x_3=0.1$ and $x_4,\ldots,x_n=0$; if $s=3.8$ then $x_1=1,\ x_2=1,\ x_3=1,\ x_4=0.8$ and $x_5,\ldots,x_n=0$.
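To make the rule concrete, here is a small sketch in plain Python (the function name is just for illustration, nothing solver-specific) that builds the preferred vector from a given total $s$:

```python
import math

def preferred_vector(s, n):
    """Preferred solution for total s in [0, n]: a prefix of 1s, one fractional entry, then 0s."""
    k = math.floor(s)            # number of variables fixed to 1
    r = s - k                    # fractional remainder
    x = [0.0] * n
    for i in range(min(k, n)):
        x[i] = 1.0
    if k < n and r > 0:
        x[k] = r                 # the single fractional variable
    return x

print(preferred_vector(2.1, 6))  # approx [1, 1, 0.1, 0, 0, 0]
print(preferred_vector(3.8, 6))  # approx [1, 1, 1, 0.8, 0, 0]
```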
I have figured out that such solutions can be forced by a constraint involving the minimum function:
$$
x_i = \min\!\left(1,\ s - \sum_{j=1}^{i-1} x_j\right)
$$
(the minimum of $1$ and $s$ minus the sum of all $x_j$ before $x_i$), but such a constraint is unfortunately not linear. I tried the trick
$$
x_i = \min\!\left(1,\ s - \sum_{j=1}^{i-1} x_j\right) = \frac12\left(1 + s - \sum_{j=1}^{i-1} x_j\right) - \frac12\left|\,1 - s + \sum_{j=1}^{i-1} x_j\,\right|,
$$
but the absolute-value function produces disjoint intervals of feasible solutions. Maybe I am not seeing something, but it seems I need help.
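For reference, the underlying identity $\min(a,b)=\frac12(a+b)-\frac12|a-b|$ is easy to check numerically; a minimal sketch in plain Python, not tied to any solver:

```python
def min_via_abs(a, b):
    # min(a, b) = (a + b)/2 - |a - b|/2
    return 0.5 * (a + b) - 0.5 * abs(a - b)

for a, b in [(1.0, 0.3), (1.0, 1.7), (0.2, 0.2), (-0.5, 2.0)]:
    assert abs(min_via_abs(a, b) - min(a, b)) < 1e-12
```

The identity itself is exact; the trouble is that an equality constraint containing $|\cdot|$ cannot be expressed with linear inequalities alone, which is where the disjoint feasible intervals come from.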
P.S. I have to use a "pure" simplex solver, so everything I use must be linear and continuous, or "tricked" into that form.
Can anyone help with this?
optimization linear-programming simplex linearization
asked Jan 25 at 15:03 by Peter, edited Jan 25 at 16:31
Sounds like you can just solve $n$ problems, where in each problem you only optimize the fractional value: you fix the first few variables to $1$ and the remaining variables to $0$.
– LinAlg
Jan 25 at 16:04
I suspect you are not really asking the problem you want help with. In minimizing a linear function $f(x_1,x_2,\ldots,x_n)$ over a domain $(x_1,x_2,\ldots,x_n) \in [0,1]^n$, the only possibility is to choose $x_i=0$ for variables with positive coefficients and $x_i=1$ for variables with negative coefficients. Of course this leaves some options where variables have zero coefficients, but you could just as well omit those variables from the problem since they do not affect the value of $f$.
– hardmath
Jan 25 at 16:50
I totally agree; in practice my task is much more complicated than I described. There are dozens of additional constraints which I did not mention, but they do not influence the problem statement, and I did not want to obfuscate the description. Anyway, as I mentioned, the objective function has many optimal solutions, but some of them are preferred according to the rules shown in the description. Therefore, I need to force the simplex method to return such a preferred solution among all optimal solutions (probably using some tricks with constraints, additional variables, etc.).
– Peter
Jan 25 at 21:19
1 Answer
As @LinAlg says, optimizing over solutions of the "preferred" form is basically the same as optimizing in a single variable. So you just need to find an optimal solution and then compare it to the best preferred solution. This is true regardless of what your objective function is.
That said, with a linear objective function you can easily compute the best "preferred" solution, since
$$
f(1,\ldots, 1, r, 0, \ldots, 0) = \sum_{i=1}^{k} c_i + r\, c_{k+1},
$$
where the first $k$ variables equal $1$ and $x_{k+1} = r$. Depending on whether you are maximizing or minimizing, and whether $c_{k+1}$ is positive or negative, it is best to just pick $r=0$ or $r=1$ (unless $c_{k+1}=0$, in which case you can pick anything).
More generally, since you're optimizing a linear function over a polygon, you will always be able to find an optimal solution at one of the vertices. In your case the polygon is just a cube, which makes finding vertices easy. This means you can easily check the signs of the $c_i$ to determine an optimal solution, and then see whether one of the preferred form exists or not.
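A minimal sketch of this recipe in Python (the names and the brute-force scan over $k$ are just for illustration, not part of any particular solver): it picks the box optimum from the signs of the $c_i$ and, separately, the best solution of the preferred form.

```python
def box_optimum(c, maximize=True):
    """Optimum of sum(c_i * x_i) over the box [0,1]^n: set each x_i by the sign of c_i."""
    if maximize:
        return [1.0 if ci > 0 else 0.0 for ci in c]
    return [1.0 if ci < 0 else 0.0 for ci in c]

def best_preferred(c, maximize=True):
    """Best solution of the form (1,...,1, r, 0,...,0); r in {0, 1} suffices for a linear objective."""
    n = len(c)
    best_val, best_x = None, None
    for k in range(n + 1):                            # k leading variables fixed to 1
        prefix = sum(c[:k])
        r_choices = (0.0, 1.0) if k < n else (0.0,)   # no fractional slot when k == n
        for r in r_choices:
            val = prefix + (r * c[k] if k < n else 0.0)
            better = best_val is None or (val > best_val if maximize else val < best_val)
            if better:
                best_val = val
                best_x = [1.0] * k + ([r] + [0.0] * (n - k - 1) if k < n else [])
    return best_x, best_val

c = [3.0, -1.0, 2.0, 0.5]
print(box_optimum(c))     # -> [1.0, 0.0, 1.0, 1.0]
print(best_preferred(c))  # best prefix-of-ones solution and its objective value
```

Comparing the value returned by best_preferred with the overall optimum then tells you whether a solution of the preferred form attains it.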
answered Jan 25 at 16:39 by tch
More general terms are polytope and hypercube for the problem described by OP.
– hardmath
Jan 25 at 17:16
Thank you for the response; I agree with both of you. Your suggestions are very similar to the first approach I was heading toward. But it is worth mentioning some details, which you noticed as well. With such an approach I have to run the simplex method $n$ times, where $n$ is the number of variables, and the variable count in my task is 60 or more. Executing the simplex method for one variable is not time consuming, but building all the data structures required to run it 60 times does not seem optimal, especially since I have to do this a few times per second (it runs on the fly inside a software process).
– Peter
Jan 25 at 21:21
Executing such a linear program on the fly for 60 variables takes only a few microseconds, and building all the necessary data structures accounts for about 80% of that. My model is usually solved in 5-7 pivots of the simplex tableau. The suggested approach would increase the execution time by hundreds of percent. Of course, if there is no other solution I will have to live with it, but maybe someone can propose a way to solve this task in a "one shot" simplex execution. What do you think, is it possible?
– Peter
Jan 25 at 21:21