Creating a matrix to project vectors in $5$-dimensional space to a $3$-dimensional subspace through a point?
I am trying to teach myself matrices. I am confused about how to create a matrix that projects vectors in $5$-dimensional space onto a $3$-dimensional subspace passing through the origin and the points $(2, 0, 0, -2, 2)$, $(0, 2, 2, 0, 0)$ and $(2, 0, 0, 2, 0)$, using an orthonormal basis.
I understand how to project a vector $V$ onto a vector $W$ ($\frac{V\cdot W}{W\cdot W}W$), but I have only used this to project 2D vectors onto other 2D vectors, and I am not sure how to apply it to this scenario ($5$-dimensional space to a $3$-dimensional subspace).
I am also unsure how to project through the given points above.
Any help would be much appreciated.
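(For reference, the single-vector formula $\frac{V\cdot W}{W\cdot W}W$ is already dimension-agnostic; it works unchanged in $5$ dimensions. A minimal numpy sketch, where the example vectors are made up for illustration:)

```python
import numpy as np

def project_onto(v, w):
    """Project vector v onto vector w: (v.w / w.w) w.
    The same formula works in any dimension, not just 2D."""
    return (v @ w) / (w @ w) * w

# Made-up 5D example: project v onto one of the subspace's direction vectors
v = np.array([3.0, 1.0, 0.0, 2.0, 1.0])
w = np.array([1.0, 0.0, 0.0, -1.0, 1.0])
p = project_onto(v, w)  # a scalar multiple of w
```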
Tags: linear-algebra, matrices, orthonormal
edited 15 hours ago by Daniele Tampieri
asked 16 hours ago by John A Smith
1 Answer
You start by making the basis of the subspace orthogonal (it will make things easier), using, for example, the Gram–Schmidt process. Luckily, you already have an orthogonal basis, so we just divide each vector by $2$ (to keep the arithmetic simple):
$$
v_1 = (1,0,0,-1,1),\\
v_2 = (0,1,1,0,0),\\
v_3 = (1,0,0,1,0).
$$
Once you have an orthogonal basis, the projection onto the subspace is just the sum of the projections onto each basis line:
$$
P_{v_1 v_2 v_3}(u) = P_{v_1}(u) + P_{v_2}(u) + P_{v_3}(u) = \frac{u\cdot v_1}{\|v_1\|^2}v_1 + \frac{u\cdot v_2}{\|v_2\|^2}v_2 + \frac{u\cdot v_3}{\|v_3\|^2}v_3.
$$
Finally, you can use the fact that $c\,(a\cdot b) = (c \otimes b)\cdot a$, where $\otimes$ is the tensor (outer) product:
$$
P_{v_1 v_2 v_3}(u) = \left(\frac{v_1\otimes v_1}{\|v_1\|^2} + \frac{v_2\otimes v_2}{\|v_2\|^2} + \frac{v_3\otimes v_3}{\|v_3\|^2}\right)u.
$$
The expression in parentheses is your projection matrix $P$:
$$
P =
\frac{1}{3}\begin{pmatrix}1&0&0&-1&1\\0&0&0&0&0\\0&0&0&0&0\\-1&0&0&1&-1\\1&0&0&-1&1\end{pmatrix} +
\frac{1}{2}\begin{pmatrix}0&0&0&0&0\\0&1&1&0&0\\0&1&1&0&0\\0&0&0&0&0\\0&0&0&0&0\end{pmatrix} +
\frac{1}{2}\begin{pmatrix}1&0&0&1&0\\0&0&0&0&0\\0&0&0&0&0\\1&0&0&1&0\\0&0&0&0&0\end{pmatrix}.
$$
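(This construction is easy to check numerically. A short numpy sketch, assuming numpy is available; `np.outer` plays the role of the tensor product $v \otimes v$ here:)

```python
import numpy as np

# Orthogonal basis of the 3D subspace (the given points, halved)
v1 = np.array([1.0, 0.0, 0.0, -1.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0, 0.0, 0.0])
v3 = np.array([1.0, 0.0, 0.0, 1.0, 0.0])

# P = sum of (v ⊗ v) / ||v||^2 over the basis vectors
P = sum(np.outer(v, v) / (v @ v) for v in (v1, v2, v3))

# Sanity checks: a projection is idempotent (P P = P) and
# fixes every vector already in the subspace
assert np.allclose(P @ P, P)
assert np.allclose(P @ v1, v1)
```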
– John A Smith (15 hours ago): Thanks for your reply; I am still a little unsure about a few things. When you are projecting the vectors $v_1, v_2, v_3$ onto the vector $u$, what is $u$? Also, I haven't learnt about the tensor product; is there a way to solve this without using it?
– Vasily Mitch (15 hours ago): I am projecting $u$ onto $v_1$; $u$ is some arbitrary vector. The tensor product is nothing fancy here, just a notational trick. You can write out all the components if you want.
– John A Smith (14 hours ago): Sorry if this sounds dumb, but what exactly is the vector $u$? Where are you getting it from? It doesn't seem to be arbitrary, since you can work out the sum of projections.
– Vasily Mitch (14 hours ago): You are trying to find a projection matrix $M$, which is by definition a matrix that, when multiplied by any vector $u$, gives its projection $P(u)$. If you take an arbitrary vector $u$, find the expression for $P(u)$, and then write $P(u) = Mu$, then $M$ is your projection matrix.
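(One concrete way to see the point in the last comment, sketched in numpy: since $M$ is linear, its $i$-th column is exactly $P(e_i)$ for the standard basis vector $e_i$, so you can build $M$ column by column without any tensor products.)

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0, -1.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0, 0.0, 0.0])
v3 = np.array([1.0, 0.0, 0.0, 1.0, 0.0])
basis = (v1, v2, v3)

def proj(u):
    """P(u): sum of projections of u onto each orthogonal basis vector."""
    return sum((u @ v) / (v @ v) * v for v in basis)

# Column i of M is P applied to the i-th standard basis vector e_i
M = np.column_stack([proj(e) for e in np.eye(5)])

# M reproduces P(u) for any u, e.g. the all-ones vector
assert np.allclose(M @ np.ones(5), proj(np.ones(5)))
```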
Thanks for contributing an answer to Mathematics Stack Exchange!