Explanation of a convex quadratic program


I want to find the minimum in the following optimization problem, but I don't even understand the problem in the first place.

$$\min\limits_{AX=b} \frac{1}{2}X^TQX + C^TX + \alpha$$

where $Q, A \in M_{n \times n}(\Bbb{R})$, $X, b, C \in \Bbb{R}^n$, and $\alpha \in \Bbb{R}$. Also, the matrix $Q$ is symmetric and positive definite.

This is all I have about the problem. So, can anyone please explain it to me? Thanks for your time.










  • This kind of optimization problem is called a quadratic program. – David M., Mar 9 at 7:07

  • Do you know anything about the matrix Q? Is it positive semidefinite, or is it something else? – Alex Shtof, Mar 9 at 11:10

  • @Alex Shtof: Yes, it is positive definite and symmetric. – Omojola Micheal, Mar 9 at 12:29















optimization definition convex-optimization quadratic-programming






asked Mar 9 at 7:04 by Omojola Micheal; edited Mar 12 at 19:19 by David M.

2 Answers
You are given two fixed $n \times n$ matrices $Q$ and $A$, two fixed $n$-dimensional vectors $B$ and $C$, and a fixed real number $\alpha$. You are supposed to minimize the value of the objective function $f(X) = \tfrac{1}{2} X^T Q X + C^T X + \alpha$ by varying $X$, subject to the constraint $AX = B$.

So, if we define $S = \{ X \in \mathbb{R}^n : AX = B \}$, then you need to find $\bar{X} \in S$ such that $f(\bar{X}) \le f(X)$ for all $X \in S$.

So, this is a quadratic programming problem with a set of linear equality constraints.
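To make this concrete, here is a tiny pure-Python sketch with made-up data (none of the numbers come from the question): with $n = 2$ and a rank-1 constraint, $S$ is a line, so we can parameterize it and scan for the minimizer.

```python
# Minimal sketch of the definition, with invented data:
# minimize f(X) = 1/2 X^T Q X + C^T X + alpha  over  S = {X : AX = B}.
# Here Q = [[2, 0], [0, 2]], C = (-2, -4), alpha = 0, and AX = B is
# x1 + x2 = 1, so S is a line parameterized as X = (t, 1 - t).

def f(x1, x2):
    return 0.5 * (2 * x1**2 + 2 * x2**2) + (-2 * x1 - 4 * x2)

# Brute-force scan of (part of) the feasible line: crude, but it shows
# what "minimize over S" means before any theory is applied.
best_f, best_t = min((f(t, 1 - t), t) for t in (i / 1000 for i in range(1001)))
print(best_t, best_f)
```

Of course, a scan only works for toy instances; the answers below describe how such problems are actually solved.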






answered Mar 9 at 7:39 by bubba (edited Mar 9 at 7:47)
  • +1. Thanks for the answer. So, is there a natural way I can solve this problem? – Omojola Micheal, Mar 9 at 7:41

  • Look up Lagrange multipliers. Or, presumably your course material tells you how to solve such problems; otherwise it's an inappropriate homework problem. – bubba, Mar 9 at 7:49

  • Thanks, I now know what theorem to apply. However, how do I show that $S$ is bounded? I have shown it is closed. – Omojola Micheal, Mar 9 at 8:05

  • I don't think that $S$ is bounded, in general. – bubba, Mar 9 at 11:47


















You were already provided an accepted answer, but I will still attempt to give more information. I assume that $A \in \mathbb{R}^{m \times n}$ with $m < n$; that is, the set $S = \{ X \in \mathbb{R}^n : A X = B \}$ is infinite (otherwise, the problem is trivial). Note that the set $S$ is affine, and therefore convex.



First, observe that since $Q$ is positive definite, the quadratic objective function is coercive, and therefore a minimizer exists. Since it is strictly convex (the Hessian is $2Q$, which is positive definite), the minimizer is unique.



Now, you have several strategies. The first is to feed your problem into any quadratic programming solver, as the first answer suggests, and it will give you the (unique) solution. The major drawback of this approach is that such solvers are not always fast, but if that is good enough for you, you do not need anything else. Below are two other strategies.



Via KKT conditions



The convex programming problem you seek to solve trivially satisfies Slater's condition, since there are no inequality constraints. Thus, any point is optimal if and only if it satisfies the KKT conditions.



In your case, the Lagrangian is
$$L(X; \lambda) = X^T Q X + C^T X + \alpha + \lambda^T (A X - B),$$
and therefore the KKT conditions are
$$
2 Q X + C + A^T \lambda = 0, \quad A X = B,
$$

or equivalently,
$$
\begin{bmatrix} 2 Q & A^T \\ A & 0 \end{bmatrix} \begin{bmatrix} X \\ \lambda \end{bmatrix} = \begin{bmatrix} -C \\ B \end{bmatrix}.
$$

This is a linear system of equations. Any solution of this system using your favorite linear solver gives you an optimal solution of your desired optimization problem. If you can exploit the special structure of $Q$ or $A$ to solve it quickly, you can solve the optimization problem quickly.
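As an illustration, here is a minimal NumPy sketch that assembles and solves this block system for a small made-up instance (the numbers are invented; the convention follows this answer, with no 1/2 factor on the quadratic term):

```python
import numpy as np

# Invented instance: minimize X^T Q X + C^T X + alpha  s.t.  A X = B,
# so stationarity reads 2 Q X + C + A^T lambda = 0.
Q = np.array([[2.0, 0.0], [0.0, 2.0]])   # symmetric positive definite
C = np.array([-2.0, -4.0])
A = np.array([[1.0, 1.0]])               # one constraint: x1 + x2 = 1
B = np.array([1.0])

n, m = Q.shape[0], A.shape[0]

# Assemble the KKT block system  [[2Q, A^T], [A, 0]] [X; lambda] = [-C; B].
K = np.block([[2 * Q, A.T], [A, np.zeros((m, m))]])
rhs = np.concatenate([-C, B])

sol = np.linalg.solve(K, rhs)
X, lam = sol[:n], sol[n:]

print(X)          # optimal point on the constraint line
print(A @ X - B)  # constraint residual, essentially zero
```

For large structured $Q$ (sparse, banded, etc.), one would replace `np.linalg.solve` with a solver that exploits that structure, which is exactly the point made above.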



By constraint elimination



Recall that any rectangular matrix $V$ with more rows than columns admits a QR decomposition $V = Q R$, where $Q$ is orthogonal and $R$ is upper triangular, and there are plenty of numerical libraries which can compute it. Since the question already uses the name $Q$, I will rename its $Q$ matrix to $P$.



By computing the decomposition of $A^T$, you have $A^T = Q R$, and the problem becomes
$$
\min_X \; X^T P X + C^T X + \alpha \quad \text{s.t.} \quad R^T Q^T X = B.
$$

Substituting $Y = Q^T X$, or $X = Q Y$ (since $Q$ is orthogonal), we obtain
$$
\min_Y \; Y^T (Q^T P Q) Y + (Q^T C)^T Y + \alpha \quad \text{s.t.} \quad R^T Y = B.
$$

Finally, $R$ is upper triangular, meaning it has the structure
$$
R = \begin{bmatrix} \tilde{R} \\ 0 \end{bmatrix},
$$

where $\tilde{R}$ is upper triangular and invertible, followed by rows of zeros. Therefore, decomposing $Y = \begin{bmatrix} Y_1 \\ Y_2 \end{bmatrix}$, where $Y_1$ collects the components corresponding to the columns of $\tilde{R}$, the constraint $R^T Y = B$ reduces to $\tilde{R}^T Y_1 = B$. You can solve this system for $Y_1$, substitute it into the objective, and obtain an unconstrained problem in $Y_2$, which can be solved by any off-the-shelf least-squares solver.
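Here is a minimal NumPy sketch of this elimination on a small made-up instance (invented numbers; `Qf` names the orthogonal factor to avoid clashing with the question's $Q$, and `P` is the renamed quadratic-term matrix as in this answer):

```python
import numpy as np

# Invented instance: minimize X^T P X + C^T X  s.t.  A X = B.
P = np.array([[2.0, 0.0], [0.0, 2.0]])   # symmetric positive definite
C = np.array([-2.0, -4.0])
A = np.array([[1.0, 1.0]])
B = np.array([1.0])
m = A.shape[0]

# Full QR of A^T:  A^T = Qf R, with Qf orthogonal (n x n) and R (n x m).
Qf, R = np.linalg.qr(A.T, mode="complete")
Rtilde = R[:m, :]                         # invertible upper-triangular block

# Change variables Y = Qf^T X; the constraint R^T Y = B pins down Y1.
Y1 = np.linalg.solve(Rtilde.T, B)

# Transformed objective data: Y^T (Qf^T P Qf) Y + (Qf^T C)^T Y.
Pt = Qf.T @ P @ Qf
ct = Qf.T @ C
P21, P22 = Pt[m:, :m], Pt[m:, m:]

# Minimize over the free block Y2: gradient 2(P21 Y1 + P22 Y2) + ct2 = 0.
Y2 = np.linalg.solve(2 * P22, -(ct[m:] + 2 * P21 @ Y1))

X = Qf @ np.concatenate([Y1, Y2])
print(X)  # should agree with the KKT-system solution
```

Note that the small unconstrained solve in `Y2` is the only step that depends on `P` and `C`; the QR factorization can be computed once and reused across problems sharing the same `A`.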



This approach is useful if you need to solve many problems with various matrices $P$ that all share the same matrix $A$. Moreover, it is useful if the dimension of the remaining variable $Y_2$ is much smaller than the dimension of $X$; in that case, constraint elimination reduces the dimension of the problem substantially.






share|cite|improve this answer











$endgroup$












    Your Answer





    StackExchange.ifUsing("editor", function ()
    return StackExchange.using("mathjaxEditing", function ()
    StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix)
    StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
    );
    );
    , "mathjax-editing");

    StackExchange.ready(function()
    var channelOptions =
    tags: "".split(" "),
    id: "69"
    ;
    initTagRenderer("".split(" "), "".split(" "), channelOptions);

    StackExchange.using("externalEditor", function()
    // Have to fire editor after snippets, if snippets enabled
    if (StackExchange.settings.snippets.snippetsEnabled)
    StackExchange.using("snippets", function()
    createEditor();
    );

    else
    createEditor();

    );

    function createEditor()
    StackExchange.prepareEditor(
    heartbeatType: 'answer',
    autoActivateHeartbeat: false,
    convertImagesToLinks: true,
    noModals: true,
    showLowRepImageUploadWarning: true,
    reputationToPostImages: 10,
    bindNavPrevention: true,
    postfix: "",
    imageUploader:
    brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
    contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
    allowUrls: true
    ,
    noCode: true, onDemand: true,
    discardSelector: ".discard-answer"
    ,immediatelyShowMarkdownHelp:true
    );



    );













    draft saved

    draft discarded


















    StackExchange.ready(
    function ()
    StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3140869%2fexplanation-of-a-convex-quadratic-program%23new-answer', 'question_page');

    );

    Post as a guest















    Required, but never shown

























    2 Answers
    2






    active

    oldest

    votes








    2 Answers
    2






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes









    2












    $begingroup$

    You are given two fixed $n times n$ matrices $Q$ and $A$, two fixed n-dimensional vectors $B$ and $C$, and a fixed real number $alpha$. You are supposed to miminimize the value of the objective function $f(X)=tfrac12 X^T Q X + B^T X + alpha$ by varying $X$, subject to the constraint $AX=B$.



    So, if we define $S = X in mathbbR^n : AX=B$, then you need to find $bar X in S$ Such that $f(bar X) le f(X)$ for all $X in S$.



    So, this is a quadratic programming problem with a set of linear equality constraints.






    share|cite|improve this answer











    $endgroup$












    • $begingroup$
      +1. Thanks for the answer. So, is there a natural way I can solve this problem?
      $endgroup$
      – Omojola Micheal
      Mar 9 at 7:41






    • 1




      $begingroup$
      Look up Lagrange multipliers. Or, presumably your course material tells you how to solve such problems, otherwise it's an inappropriate homework problem.
      $endgroup$
      – bubba
      Mar 9 at 7:49










    • $begingroup$
      Thanks, I now know what Theorem to apply. However, how do I show that $S$ is bounded? I have shown it is closed.
      $endgroup$
      – Omojola Micheal
      Mar 9 at 8:05











    • $begingroup$
      I don't think that $S$ is bounded, in general.
      $endgroup$
      – bubba
      Mar 9 at 11:47















    2












    $begingroup$

    You are given two fixed $n times n$ matrices $Q$ and $A$, two fixed n-dimensional vectors $B$ and $C$, and a fixed real number $alpha$. You are supposed to miminimize the value of the objective function $f(X)=tfrac12 X^T Q X + B^T X + alpha$ by varying $X$, subject to the constraint $AX=B$.



    So, if we define $S = X in mathbbR^n : AX=B$, then you need to find $bar X in S$ Such that $f(bar X) le f(X)$ for all $X in S$.



    So, this is a quadratic programming problem with a set of linear equality constraints.






    share|cite|improve this answer











    $endgroup$












    • $begingroup$
      +1. Thanks for the answer. So, is there a natural way I can solve this problem?
      $endgroup$
      – Omojola Micheal
      Mar 9 at 7:41






    • 1




      $begingroup$
      Look up Lagrange multipliers. Or, presumably your course material tells you how to solve such problems, otherwise it's an inappropriate homework problem.
      $endgroup$
      – bubba
      Mar 9 at 7:49










    • $begingroup$
      Thanks, I now know what Theorem to apply. However, how do I show that $S$ is bounded? I have shown it is closed.
      $endgroup$
      – Omojola Micheal
      Mar 9 at 8:05











    • $begingroup$
      I don't think that $S$ is bounded, in general.
      $endgroup$
      – bubba
      Mar 9 at 11:47













    2












    2








    2





    $begingroup$

    You are given two fixed $n times n$ matrices $Q$ and $A$, two fixed n-dimensional vectors $B$ and $C$, and a fixed real number $alpha$. You are supposed to miminimize the value of the objective function $f(X)=tfrac12 X^T Q X + B^T X + alpha$ by varying $X$, subject to the constraint $AX=B$.



    So, if we define $S = X in mathbbR^n : AX=B$, then you need to find $bar X in S$ Such that $f(bar X) le f(X)$ for all $X in S$.



    So, this is a quadratic programming problem with a set of linear equality constraints.






    share|cite|improve this answer











    $endgroup$



    You are given two fixed $n times n$ matrices $Q$ and $A$, two fixed n-dimensional vectors $B$ and $C$, and a fixed real number $alpha$. You are supposed to miminimize the value of the objective function $f(X)=tfrac12 X^T Q X + B^T X + alpha$ by varying $X$, subject to the constraint $AX=B$.



    So, if we define $S = X in mathbbR^n : AX=B$, then you need to find $bar X in S$ Such that $f(bar X) le f(X)$ for all $X in S$.



    So, this is a quadratic programming problem with a set of linear equality constraints.







    share|cite|improve this answer














    share|cite|improve this answer



    share|cite|improve this answer








    edited Mar 9 at 7:47

























    answered Mar 9 at 7:39









    bubbabubba

    30.7k33188




    30.7k33188











    • $begingroup$
      +1. Thanks for the answer. So, is there a natural way I can solve this problem?
      $endgroup$
      – Omojola Micheal
      Mar 9 at 7:41






    • 1




      $begingroup$
      Look up Lagrange multipliers. Or, presumably your course material tells you how to solve such problems, otherwise it's an inappropriate homework problem.
      $endgroup$
      – bubba
      Mar 9 at 7:49










    • $begingroup$
      Thanks, I now know what Theorem to apply. However, how do I show that $S$ is bounded? I have shown it is closed.
      $endgroup$
      – Omojola Micheal
      Mar 9 at 8:05











    • $begingroup$
      I don't think that $S$ is bounded, in general.
      $endgroup$
      – bubba
      Mar 9 at 11:47
















    • $begingroup$
      +1. Thanks for the answer. So, is there a natural way I can solve this problem?
      $endgroup$
      – Omojola Micheal
      Mar 9 at 7:41






    • 1




      $begingroup$
      Look up Lagrange multipliers. Or, presumably your course material tells you how to solve such problems, otherwise it's an inappropriate homework problem.
      $endgroup$
      – bubba
      Mar 9 at 7:49










    • $begingroup$
      Thanks, I now know what Theorem to apply. However, how do I show that $S$ is bounded? I have shown it is closed.
      $endgroup$
      – Omojola Micheal
      Mar 9 at 8:05











    • $begingroup$
      I don't think that $S$ is bounded, in general.
      $endgroup$
      – bubba
      Mar 9 at 11:47















    $begingroup$
    +1. Thanks for the answer. So, is there a natural way I can solve this problem?
    $endgroup$
    – Omojola Micheal
    Mar 9 at 7:41




    $begingroup$
    +1. Thanks for the answer. So, is there a natural way I can solve this problem?
    $endgroup$
    – Omojola Micheal
    Mar 9 at 7:41




    1




    1




    $begingroup$
    Look up Lagrange multipliers. Or, presumably your course material tells you how to solve such problems, otherwise it's an inappropriate homework problem.
    $endgroup$
    – bubba
    Mar 9 at 7:49




    $begingroup$
    Look up Lagrange multipliers. Or, presumably your course material tells you how to solve such problems, otherwise it's an inappropriate homework problem.
    $endgroup$
    – bubba
    Mar 9 at 7:49












    $begingroup$
    Thanks, I now know what Theorem to apply. However, how do I show that $S$ is bounded? I have shown it is closed.
    $endgroup$
    – Omojola Micheal
    Mar 9 at 8:05





    $begingroup$
    Thanks, I now know what Theorem to apply. However, how do I show that $S$ is bounded? I have shown it is closed.
    $endgroup$
    – Omojola Micheal
    Mar 9 at 8:05













    $begingroup$
    I don't think that $S$ is bounded, in general.
    $endgroup$
    – bubba
    Mar 9 at 11:47




    $begingroup$
    I don't think that $S$ is bounded, in general.
    $endgroup$
    – bubba
    Mar 9 at 11:47











    2












    $begingroup$

    You were already provided an accepted answer, but I will still attempt to provide more information. I assume that $A in mathbbR^m times n$ where $m < n$, that is, the set $S = X in mathbbR^n : A X = B $ is infinite (otherwise, the problem is trivial). Note, that the set $S$ is affine, and therefore convex.



    First, observe that since $Q$ is positive definite, the quadratic objective function is coercive, and therefore a minimizer exists. Since it is strictly convex (the Hessian is $2Q$, which is positive definite), the minimizer is unique.



    Now, you have several strategies. The first one, is feeding your problem into any Quadratic Programming solver software like the first answer suggests, and it will give you the (unique) solution. The major drawback of this approach is that such solvers are not always fast. But if it is good enough for you, you do not need to do anything else. Below are two other strategies.



    Via KKT conditions



    The convex programming problem you seek to solve trivially satisfies Slater's condition, since there are no inequality constraints. Thus, any point is optimal if and only if it satisfies the KKT conditions.



    In your case, the Lagrangian is
    $$L(X; lambda) = X^T Q X + C^T X + alpha + lambda^T (A X - B),$$
    and therefore the KKT conditions are
    $$
    2 Q X + C + A^T lambda = 0, quad A X = B,
    $$

    or equivalently written
    $$
    beginbmatrix 2 Q & A^T \ A & 0 endbmatrix beginbmatrix X \ lambda endbmatrix = beginbmatrix -C \ B endbmatrix.
    $$

    This is a linear system of equations. Any solution of this system using your favorite linear solver gives you an optimal solution of your desired optimization problem. If you can exploit the special structure of $Q$ or $A$ to solve it quickly, you can solve the optimization problem quickly.



    By constraint elimination



    Recall, that any rectangular matrix $V$ with more rows than columns admists a QR decomposition into $V = Q R$, where $Q$ is orthogonal and $R$ is upper-triangular. There are also plenty of numerical libraries which can compute it. Since you already used the name $Q$, I will rename your $Q$ matrix into $P$.



    By computing the decomposition of $A^T$, you have $A^T = Q R$, and the problem becomes
    $$
    min_X X^T P X + C^T X + alpha quad texts.t. quad R^T Q^T X = B
    $$

    Substituting $Q^T X = Y$, or $X = Q Y$ (since $Q$ is orthogonal), we obtain
    $$
    min_Y Y^T (Q^T P Q) Y + (Q^T C) Y + alpha quad texts.t. quad R^T X = B
    $$

    Finally, $R$ is upper triangular. Meaning that it has the following structure
    $$
    R = beginbmatrix tildeR \ 0 endbmatrix,
    $$

    where $tildeR$ is upper triangular and invertible, which is followed by rows full of zeros. Therefore, by decomposing $Y = beginbmatrixY_1 \ Y_2 endbmatrix$, where $Y_1$ are the components which correspond to the columns of $tildeR$, the constraint $R^T Y = B$ can be written as $tildeR^T Y_1 = B$. You can solve this system, find the value of $Y_1$, substitute it into the objective, and obtain an unconstrained problem with $Y_1$, which can be solved any off-the-shelf least-squares solver.



    This approach is useful if you need to solve many problems which various matrices $P$, but all share the same matrix $A$. Moreover, it is useful if the dimension of the remaining variable $Y_2$ is much smaller than the dimension of $X$. Constraint elimination allows you to reduce the dimension of the given problem substantially in that case.






    share|cite|improve this answer











    $endgroup$

















      2












      $begingroup$

      You were already provided an accepted answer, but I will still attempt to provide more information. I assume that $A in mathbbR^m times n$ where $m < n$, that is, the set $S = X in mathbbR^n : A X = B $ is infinite (otherwise, the problem is trivial). Note, that the set $S$ is affine, and therefore convex.



      First, observe that since $Q$ is positive definite, the quadratic objective function is coercive, and therefore a minimizer exists. Since it is strictly convex (the Hessian is $2Q$, which is positive definite), the minimizer is unique.



      Now, you have several strategies. The first one, is feeding your problem into any Quadratic Programming solver software like the first answer suggests, and it will give you the (unique) solution. The major drawback of this approach is that such solvers are not always fast. But if it is good enough for you, you do not need to do anything else. Below are two other strategies.



      Via KKT conditions



      The convex programming problem you seek to solve trivially satisfies Slater's condition, since there are no inequality constraints. Thus, any point is optimal if and only if it satisfies the KKT conditions.



      In your case, the Lagrangian is
      $$L(X; lambda) = X^T Q X + C^T X + alpha + lambda^T (A X - B),$$
      and therefore the KKT conditions are
      $$
      2 Q X + C + A^T lambda = 0, quad A X = B,
      $$

      or equivalently written
      $$
      beginbmatrix 2 Q & A^T \ A & 0 endbmatrix beginbmatrix X \ lambda endbmatrix = beginbmatrix -C \ B endbmatrix.
      $$

      This is a linear system of equations. Any solution of this system using your favorite linear solver gives you an optimal solution of your desired optimization problem. If you can exploit the special structure of $Q$ or $A$ to solve it quickly, you can solve the optimization problem quickly.



      By constraint elimination



      Recall, that any rectangular matrix $V$ with more rows than columns admists a QR decomposition into $V = Q R$, where $Q$ is orthogonal and $R$ is upper-triangular. There are also plenty of numerical libraries which can compute it. Since you already used the name $Q$, I will rename your $Q$ matrix into $P$.



      By computing the decomposition of $A^T$, you have $A^T = Q R$, and the problem becomes
      $$
      min_X X^T P X + C^T X + alpha quad texts.t. quad R^T Q^T X = B
      $$

      Substituting $Q^T X = Y$, or $X = Q Y$ (since $Q$ is orthogonal), we obtain
      $$
      min_Y Y^T (Q^T P Q) Y + (Q^T C) Y + alpha quad texts.t. quad R^T X = B
      $$

      Finally, $R$ is upper triangular. Meaning that it has the following structure
      $$
      R = beginbmatrix tildeR \ 0 endbmatrix,
      $$

      where $tildeR$ is upper triangular and invertible, which is followed by rows full of zeros. Therefore, by decomposing $Y = beginbmatrixY_1 \ Y_2 endbmatrix$, where $Y_1$ are the components which correspond to the columns of $tildeR$, the constraint $R^T Y = B$ can be written as $tildeR^T Y_1 = B$. You can solve this system, find the value of $Y_1$, substitute it into the objective, and obtain an unconstrained problem with $Y_1$, which can be solved any off-the-shelf least-squares solver.



      This approach is useful if you need to solve many problems which various matrices $P$, but all share the same matrix $A$. Moreover, it is useful if the dimension of the remaining variable $Y_2$ is much smaller than the dimension of $X$. Constraint elimination allows you to reduce the dimension of the given problem substantially in that case.






      share|cite|improve this answer











      $endgroup$















        2












        2








        2





        $begingroup$

        You were already provided an accepted answer, but I will still attempt to provide more information. I assume that $A in mathbbR^m times n$ where $m < n$, that is, the set $S = X in mathbbR^n : A X = B $ is infinite (otherwise, the problem is trivial). Note, that the set $S$ is affine, and therefore convex.



        First, observe that since $Q$ is positive definite, the quadratic objective function is coercive, and therefore a minimizer exists. Since it is strictly convex (the Hessian is $2Q$, which is positive definite), the minimizer is unique.



        Now, you have several strategies. The first one, is feeding your problem into any Quadratic Programming solver software like the first answer suggests, and it will give you the (unique) solution. The major drawback of this approach is that such solvers are not always fast. But if it is good enough for you, you do not need to do anything else. Below are two other strategies.



        Via KKT conditions



        The convex programming problem you seek to solve trivially satisfies Slater's condition, since there are no inequality constraints. Thus, any point is optimal if and only if it satisfies the KKT conditions.



        In your case, the Lagrangian is
        $$L(X; lambda) = X^T Q X + C^T X + alpha + lambda^T (A X - B),$$
        and therefore the KKT conditions are
        $$
        2 Q X + C + A^T lambda = 0, quad A X = B,
        $$

        or equivalently written
        $$
        beginbmatrix 2 Q & A^T \ A & 0 endbmatrix beginbmatrix X \ lambda endbmatrix = beginbmatrix -C \ B endbmatrix.
        $$

        This is a linear system of equations. Any solution of this system using your favorite linear solver gives you an optimal solution of your desired optimization problem. If you can exploit the special structure of $Q$ or $A$ to solve it quickly, you can solve the optimization problem quickly.



        By constraint elimination



        Recall, that any rectangular matrix $V$ with more rows than columns admists a QR decomposition into $V = Q R$, where $Q$ is orthogonal and $R$ is upper-triangular. There are also plenty of numerical libraries which can compute it. Since you already used the name $Q$, I will rename your $Q$ matrix into $P$.



        By computing the decomposition of $A^T$, you have $A^T = Q R$, and the problem becomes
        $$
        \min_X \; X^T P X + C^T X + \alpha \quad \text{s.t.} \quad R^T Q^T X = B.
        $$

        Substituting $Y = Q^T X$, or $X = Q Y$ (since $Q$ is orthogonal), we obtain
        $$
        \min_Y \; Y^T (Q^T P Q) Y + (Q^T C)^T Y + \alpha \quad \text{s.t.} \quad R^T Y = B.
        $$

        Finally, $R$ is upper triangular, meaning that it has the structure
        $$
        R = \begin{bmatrix} \tilde{R} \\ 0 \end{bmatrix},
        $$

        where $\tilde{R}$ is upper triangular and invertible (assuming $A$ has full row rank), followed by rows of zeros. Therefore, by decomposing $Y = \begin{bmatrix} Y_1 \\ Y_2 \end{bmatrix}$, where $Y_1$ contains the components corresponding to the rows of $\tilde{R}$, the constraint $R^T Y = B$ can be written as $\tilde{R}^T Y_1 = B$. You can solve this triangular system to find $Y_1$, substitute it into the objective, and obtain an unconstrained quadratic problem in $Y_2$, which can be solved by any off-the-shelf solver (or a single linear solve).



        This approach is useful if you need to solve many problems with various matrices $P$ which all share the same matrix $A$, since the QR decomposition depends only on $A$ and can be reused. Moreover, it is useful if the dimension of the remaining variable $Y_2$ is much smaller than the dimension of $X$. Constraint elimination allows you to reduce the dimension of the given problem substantially in that case.
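        The elimination procedure can be sketched with NumPy's QR routine. The instance below is invented for the example, and the code follows the answer's naming ($P$ for the objective matrix; the orthogonal factor is called `Qf` to avoid a name clash in the code):

```python
import numpy as np

# Invented instance: minimize X^T P X + C^T X  s.t.  A X = B,
# with P symmetric positive definite and A of full row rank (m < n).
rng = np.random.default_rng(1)
n, m = 5, 2
M = rng.standard_normal((n, n))
P = M @ M.T + n * np.eye(n)
A = rng.standard_normal((m, n))
C = rng.standard_normal(n)
B = rng.standard_normal(m)

# Full QR of A^T:  A^T = Qf @ R, with Qf orthogonal (n x n), R = [Rt; 0] (n x m).
Qf, R = np.linalg.qr(A.T, mode="complete")
Rt = R[:m, :]                         # invertible upper-triangular block

# The constraint R^T Y = B fixes Y1 via the triangular system Rt^T Y1 = B.
Y1 = np.linalg.solve(Rt.T, B)

# Transformed objective data: Y^T (Qf^T P Qf) Y + (Qf^T C)^T Y.
Pt = Qf.T @ P @ Qf
Ct = Qf.T @ C

# Minimize over the free block Y2: gradient 2(Pt21 Y1 + Pt22 Y2) + C2 = 0.
P22, P21, C2 = Pt[m:, m:], Pt[m:, :m], Ct[m:]
Y2 = np.linalg.solve(2 * P22, -(C2 + 2 * P21 @ Y1))

# Recover X = Qf Y and check feasibility.
X = Qf @ np.concatenate([Y1, Y2])
print(np.allclose(A @ X, B))
```

        Note that `Qf`, `R`, `Rt`, `Pt`, and the block partition only need to be recomputed when $A$ changes, which is the reuse advantage described above.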










        edited Mar 13 at 11:16

























        answered Mar 10 at 11:52









        Alex Shtof



























