Creating a matrix to project vectors in $5$-dimensional space to a $3$-dimensional subspace through a point?



I am trying to teach myself matrices. I am confused about how to create a matrix that projects vectors in $5$-dimensional space onto the $3$-dimensional subspace that passes through the origin and the points $(2, 0, 0, -2, 2)$, $(0, 2, 2, 0, 0)$ and $(2, 0, 0, 2, 0)$, by using an orthonormal basis.



I understand how to project a vector $V$ onto a vector $W$ ($\frac{V\cdot W}{W\cdot W}W$), but I have only used this to project $2$D vectors onto other $2$D vectors, and I am not sure how to apply it to this scenario ($5$-dimensional space to a $3$-dimensional subspace).
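(For concreteness, the familiar one-vector formula mentioned above can be sketched in a few lines of Python; this snippet is illustrative and not part of the original post.)

```python
def project(V, W):
    """Project vector V onto vector W using (V.W / W.W) W."""
    dot_vw = sum(v * w for v, w in zip(V, W))
    dot_ww = sum(w * w for w in W)
    return [dot_vw / dot_ww * w for w in W]

# Projecting (3, 4) onto the x-axis keeps only the x-component
print(project([3, 4], [1, 0]))
```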



I am also unsure how to make the projection pass through the specific points given above.



Any help would be much appreciated.










      linear-algebra matrices orthonormal






asked 16 hours ago by John A Smith (new contributor); edited 15 hours ago by Daniele Tampieri
1 Answer
You start by making the basis of the subspace orthogonal (it will make things easier), using, for example, the Gram–Schmidt process. Luckily, you already have an orthogonal basis, so we just divide each vector by $2$ (to keep the arithmetic simple):
$$
v_1 = (1,0,0,-1,1),\\
v_2 = (0,1,1,0,0),\\
v_3 = (1,0,0,1,0).
$$



When you have an orthogonal basis, the projection onto the subspace is just the sum of the projections onto each basis line:
$$
P_{v_1v_2v_3}(u) = P_{v_1}(u) + P_{v_2}(u) + P_{v_3}(u) = \frac{u\cdot v_1}{\|v_1\|^2}v_1 + \frac{u\cdot v_2}{\|v_2\|^2}v_2 + \frac{u\cdot v_3}{\|v_3\|^2}v_3.
$$



Finally, you can use the fact that $c\,(a\cdot b)=(c \otimes b)\cdot a$, where $\otimes$ is the tensor (outer) product:

$$
P_{v_1v_2v_3}(u) = \left(\frac{v_1\otimes v_1}{\|v_1\|^2}+\frac{v_2\otimes v_2}{\|v_2\|^2}+\frac{v_3\otimes v_3}{\|v_3\|^2}\right)\cdot u.
$$



The expression in parentheses is your projection matrix $P$:
$$
P=
\frac13\begin{pmatrix}1&0&0&-1&1\\0&0&0&0&0\\0&0&0&0&0\\-1&0&0&1&-1\\1&0&0&-1&1\end{pmatrix} +
\frac12\begin{pmatrix}0&0&0&0&0\\0&1&1&0&0\\0&1&1&0&0\\0&0&0&0&0\\0&0&0&0&0\end{pmatrix} +
\frac12\begin{pmatrix}1&0&0&1&0\\0&0&0&0&0\\0&0&0&0&0\\1&0&0&1&0\\0&0&0&0&0\end{pmatrix}.
$$
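As a quick numerical sanity check (a pure-Python sketch, not part of the original answer), you can compute the sum-of-projections formula directly and verify that projecting a second time changes nothing, which is the defining property of a projection:

```python
# Orthogonal basis of the subspace (the given points, divided by 2)
v1 = [1, 0, 0, -1, 1]
v2 = [0, 1, 1, 0, 0]
v3 = [1, 0, 0, 1, 0]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project(u, basis=(v1, v2, v3)):
    """Sum of the projections of u onto each (mutually orthogonal) basis vector."""
    out = [0.0] * len(u)
    for v in basis:
        c = dot(u, v) / dot(v, v)          # coefficient (u.v) / |v|^2
        out = [o + c * x for o, x in zip(out, v)]
    return out

u = [1, 2, 3, 4, 5]                        # an arbitrary 5-dimensional vector
p = project(u)
print(p)
```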






• Thanks for your reply, I am still a little bit unsure about a few things. When you are projecting the vectors $v_1$, $v_2$, $v_3$ onto the vector $u$, what is $u$? Also, I haven't learnt about the tensor product; is there a way to solve this without using it?
  – John A Smith, 15 hours ago

• I am projecting $u$ onto $v_1$; $u$ is some arbitrary vector. The tensor product is nothing fancy here, it's just a notational trick. You can write down all the components if you want.
  – Vasily Mitch, 15 hours ago

• I am sorry if this sounds dumb, but what exactly is the vector $u$? Where are you getting it from? It doesn't seem to be an arbitrary variable, since you can work out the sum of projections.
  – John A Smith, 14 hours ago

• You are trying to find a projection matrix $M$, which is by definition a matrix that, when multiplied by any vector $u$, gives you its projection $P(u)$. If you take an arbitrary vector $u$, find the expression for $P(u)$, and then write $P(u)=Mu$, then $M$ is your projection matrix.
  – Vasily Mitch, 14 hours ago
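Following up on the comments, here is a small pure-Python sketch (my own illustration, not from the thread) that builds the matrix $M$ without any tensor-product notation: column $j$ of $M$ is simply the projection of the $j$-th standard basis vector $e_j$, because $Me_j$ picks out that column.

```python
v1 = [1, 0, 0, -1, 1]
v2 = [0, 1, 1, 0, 0]
v3 = [1, 0, 0, 1, 0]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def P(u):
    """Projection of u onto the subspace spanned by v1, v2, v3."""
    out = [0.0] * 5
    for v in (v1, v2, v3):
        c = dot(u, v) / dot(v, v)
        out = [o + c * x for o, x in zip(out, v)]
    return out

# Column j of M is P(e_j): feed each standard basis vector through P
cols = [P([1.0 if i == j else 0.0 for i in range(5)]) for j in range(5)]
M = [[cols[j][i] for j in range(5)] for i in range(5)]  # transpose columns to rows

# Multiplying M by any u now reproduces P(u)
u = [1, 2, 3, 4, 5]
Mu = [dot(row, u) for row in M]
```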










answered 15 hours ago by Vasily Mitch; edited 15 hours ago










