Iterative algorithm for a simple linear optimization problem
Let $c_1,\dots,c_n$ be $n$ positive numbers and so be $a_1,\dots,a_n$. For some $r$ such that $1\leq r\leq n$, consider the optimization problem
\begin{align}
\max_{x_i\in\mathbb{R}} \quad & \sum_{i=1}^{r} c_i x_i - \sum_{i=r+1}^{n} c_i x_i \\
\text{s.t.} \quad & \sum_{i=1}^{r} a_i x_i - \sum_{i=r+1}^{n} a_i x_i \leq b \\
& 0 \leq x_i \leq 1, \quad \forall i
\end{align}
I understand that I can use a linear solver for this; however, I am interested in finding a separate algorithm for it. I am using the following approach. First I initialize all $x_i$ as
\begin{align}
x_i \,=\, \begin{cases} 1 & \text{if } i\in\{1,\dots,r\} \\ 0 & \text{if } i\in\{r+1,\dots,n\} \end{cases}
\end{align}
Note that this point attains the optimal objective value of the unconstrained problem, which is always larger than (or equal to) the optimal value of the constrained one. If this point is feasible, we are done. Otherwise, note that increasing any $x_i$ with $i\in\{r+1,\dots,n\}$ decreases both the objective and the LHS of the constraint (making the constraint more likely to be satisfied). Thus, I pick the smallest among these $c_i$ and increase its $x_i$ within $[0,1]$ to see whether the constraint becomes feasible (the objective decreases along the way). If not, I set that $x_i=1$, pick the second smallest $c_i$ in $\{c_i\}_{i=r+1}^{n}$, and continue in this fashion until I exhaust that set. Then I move on to the set $\{c_i\}_{i=1}^{r}$ and do similar operations there (this time decreasing the corresponding $x_i$ from $1$ toward $0$). My question is whether this will find the true solution.

real-analysis linear-algebra optimization linear-programming numerical-optimization
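To make the sweep concrete, here is a minimal Python sketch of the procedure as I read it (the function name `greedy_sweep` and the return convention are mine; it is only a transcription of the heuristic, not a claim that it is optimal):

    import numpy as np

    def greedy_sweep(c, a, b, r):
        """Heuristic from the question: start at the unconstrained maximizer and move
        variables one at a time, smallest objective coefficient first, until feasible."""
        c, a = np.asarray(c, float), np.asarray(a, float)
        n = len(c)
        x = np.zeros(n)
        x[:r] = 1.0                              # x_i = 1 for i <= r, x_i = 0 for i > r

        def lhs(x):                              # left-hand side of the single constraint
            return a[:r] @ x[:r] - a[r:] @ x[r:]

        # second group: increase x_i (smallest c_i first) until the constraint holds
        for i in sorted(range(r, n), key=lambda i: c[i]):
            if lhs(x) <= b:
                return x
            need = lhs(x) - b                    # remaining constraint violation
            x[i] = min(1.0, need / a[i]) if a[i] > 0 else 1.0
        # first group: decrease x_i (smallest c_i first) if still infeasible
        for i in sorted(range(r), key=lambda i: c[i]):
            if lhs(x) <= b:
                return x
            need = lhs(x) - b
            x[i] = max(0.0, 1.0 - need / a[i]) if a[i] > 0 else 0.0
        return x if lhs(x) <= b else None        # None: the sweep found no feasible point

Each variable is touched at most once, so the sweep costs $O(n\log n)$ for the two sorts plus a linear pass.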
asked Mar 16 at 11:58 by dineshdileep; edited Mar 16 at 12:01 by Rodrigo de Azevedo
I believe that constructing a one-dimensional Lagrangian dual problem is your way to a specialized algorithm. Construct a Lagrangian with a multiplier $\lambda > 0$ for the first inequality constraint, and minimize it subject to $x_i \in [0, 1]$. You can maximize the resulting dual function with any one-dimensional method you like, such as golden-section search, and then extract the primal optimum by substituting $\lambda^*$ into the Lagrangian minimization problem.
– Alex Shtof, Mar 17 at 8:09
Please clear this doubt: is skipping the multipliers corresponding to $x_i\in [0,1]$ OK? They are also constraints, right?
– dineshdileep, Mar 17 at 12:04
Yes. Duality works for convex problems of the form $\min f(x)~\text{s.t.}~g_i(x) \leq 0,~x \in C$. The dual function is $q(\lambda) = \min_{x \in C} \left\{ f(x) + \sum_i \lambda_i g_i(x) \right\}$. It is our choice which constraints we treat as $g_i$ and which we treat as $C$. I chose to treat $x_i \in [0, 1]$ as the set $C$ and am left with only one constraint that has a multiplier.
– Alex Shtof, Mar 17 at 12:13
Got it, thanks!
– dineshdileep, Mar 17 at 12:34
@AlexShtof how are KKT conditions satisfied here?
– dineshdileep, Mar 17 at 21:57
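For reference, here is what the one-dimensional dual suggested in the comments looks like for this particular problem (my own working of the recipe; worth double-checking). Writing the problem as a minimization of $-\left(\sum_{i=1}^{r} c_i x_i - \sum_{i=r+1}^{n} c_i x_i\right)$ and attaching a multiplier $\lambda \ge 0$ only to the budget constraint, the Lagrangian separates per coordinate over the box $[0,1]^n$, so the dual function has the closed form
$$
q(\lambda) \;=\; -\lambda b \;+\; \sum_{i=1}^{r} \min\bigl(0,\, \lambda a_i - c_i\bigr) \;+\; \sum_{i=r+1}^{n} \min\bigl(0,\, c_i - \lambda a_i\bigr),
$$
with inner minimizer $x_i(\lambda)=1$ where the corresponding coefficient is negative and $x_i(\lambda)=0$ where it is positive; when the primal is feasible, strong LP duality gives $\max_x = -\max_{\lambda\ge 0} q(\lambda)$. The function $q$ is concave and piecewise linear with breakpoints at $\lambda = c_i/a_i$. A golden-section sketch (function names are mine; the primal-recovery step glosses over the coordinates with $c_i = \lambda^* a_i$, which are exactly the ones that may end up fractional at the optimum):

    import numpy as np

    def dual_value(lam, c, a, b, r):
        """q(lambda) for the single-constraint Lagrangian dual written out above."""
        coef = np.concatenate([lam * a[:r] - c[:r], c[r:] - lam * a[r:]])
        return np.minimum(0.0, coef).sum() - lam * b

    def solve_via_dual(c, a, b, r, iters=200):
        """Maximize the concave, piecewise-linear q(lambda) over [0, lam_hi] by
        golden-section search (any 1-D method would do), assuming the primal is
        feasible so that the dual is bounded; then read off a primal candidate."""
        c, a = np.asarray(c, float), np.asarray(a, float)
        lam_hi = float(np.max(c / np.maximum(a, 1e-12))) + 1.0   # past the last breakpoint
        phi = (np.sqrt(5.0) - 1.0) / 2.0
        lo, hi = 0.0, lam_hi
        x1, x2 = hi - phi * (hi - lo), lo + phi * (hi - lo)
        f1, f2 = dual_value(x1, c, a, b, r), dual_value(x2, c, a, b, r)
        for _ in range(iters):
            if f1 < f2:                       # maximum lies in [x1, hi]
                lo, x1, f1 = x1, x2, f2
                x2 = lo + phi * (hi - lo)
                f2 = dual_value(x2, c, a, b, r)
            else:                             # maximum lies in [lo, x2]
                hi, x2, f2 = x2, x1, f1
                x1 = hi - phi * (hi - lo)
                f1 = dual_value(x1, c, a, b, r)
        lam = 0.5 * (lo + hi)
        coef = np.concatenate([lam * a[:r] - c[:r], c[r:] - lam * a[r:]])
        x = (coef < 0).astype(float)          # inner minimizer at lambda*; coordinates
        return lam, x                         # with coef == 0 need a separate tie-break

Roughly speaking, this is also where the KKT question above leads: at $\lambda^*$ every coordinate with $c_i \neq \lambda^* a_i$ is pinned to $0$ or $1$, and (when $\lambda^* > 0$) complementary slackness forces the budget constraint to hold with equality, which fixes the remaining, generically single, fractional coordinate.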
1 Answer
This algorithm is suboptimal. Suppose $a_i=0$; then you change $x_i$ unnecessarily. If you consider the ratio of $c_i$ and $a_i$, you almost mimic the simplex method.
– LinAlg, answered Mar 16 at 13:37

Can you explain the part on the simplex method more?
– dineshdileep, Mar 16 at 13:54

@dineshdileep I would need some time to work out the details, and I am not 100% certain it boils down to the same result, but the ratio test in the dual problem considers the same ratios.
– LinAlg, Mar 16 at 14:20
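For what it is worth, one concrete way to read the ratio hint (my interpretation, not necessarily the exact dual-simplex ratio test): starting from the unconstrained maximizer, every unit by which a variable is moved costs $c_i$ in the objective and removes $a_i$ of constraint violation, so the cheapest repair buys reduction in order of increasing $c_i/a_i$ rather than increasing $c_i$. A sketch, with the same conventions as the code above:

    import numpy as np

    def ratio_greedy(c, a, b, r):
        """One reading of the c_i/a_i remark (a sketch; compare against an LP solver).
        Variables with a_i == 0 give no constraint reduction and are skipped, which is
        exactly the case where the smallest-c_i rule wastes a move."""
        c, a = np.asarray(c, float), np.asarray(a, float)
        n = len(c)
        x = np.zeros(n); x[:r] = 1.0
        excess = a[:r].sum() - b                    # current constraint violation
        if excess <= 0:
            return x
        order = sorted((i for i in range(n) if a[i] > 0), key=lambda i: c[i] / a[i])
        for i in order:
            if excess <= 0:
                break
            step = min(1.0, excess / a[i])          # fraction of x_i's range to move
            x[i] = step if i >= r else 1.0 - step   # raise second group, lower first group
            excess -= step * a[i]
        return x if excess <= 1e-12 else None       # None: no feasible point exists

Because there is only one coupling constraint besides the box, this repair step has the fractional-knapsack structure, which is presumably why the ordering resembles a ratio test.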