


Other inner products for $\mathbb{R}^n$















For $\mathbb{R}^n$, the standard inner product is the dot product. It is defined as $\langle v, w\rangle = \sum_i v_i \cdot w_i$. I am aware that any scaled version, namely $\langle v, w\rangle = \sum_i \lambda_i \cdot v_i \cdot w_i$, will still satisfy the four inner product requirements.

Is there any inner product for $\mathbb{R}^n$ that is not just a scaled version of the standard dot product?

I tried for $\mathbb{R}^2$ with $\langle v, w\rangle = v_1 \cdot w_2 + v_2 \cdot w_1$, but that is not positive definite.
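A quick numerical check (an added sketch in NumPy, not part of the original question) makes the failure concrete: the candidate form is $v^T M w$ for the symmetric matrix $M$ below, and $v = (1,-1)$ has a negative "self-inner-product".

    import numpy as np

    # Candidate form <v, w> = v1*w2 + v2*w1, i.e. v^T M w with M = [[0, 1], [1, 0]].
    M = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

    v = np.array([1.0, -1.0])
    print(v @ M @ v)               # -2.0: <v, v> < 0, so not positive definite
    print(np.linalg.eigvalsh(M))   # [-1.  1.]: M has a negative eigenvalue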





















  • So, everyone is saying you can also use a matrix, and that works. Is there a proof, though, that any inner product can be represented by such a matrix-type inner product?
    – Ion Sme, 2 days ago










  • Yes. Here is a sketch. Suppose that $(\cdot, \cdot)$ is some inner product on $\mathbb{R}^n$. Let $\mathbf{e}_1, \ldots, \mathbf{e}_n$ be the standard basis for $\mathbb{R}^n$ (so $\mathbf{e}_k$ is the vector whose $k$-th entry is $1$, and all other entries are $0$). Define $A$ to be the $n\times n$ matrix with $i,j$ entry $a_{ij} = (\mathbf{e}_i, \mathbf{e}_j)$. Then show that this $A$ satisfies $(\mathbf{x}, \mathbf{y}) = \mathbf{x}^T A \mathbf{y}$ for all $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$. EDIT: I see Travis has mentioned this in an answer below.
    – Minus One-Twelfth, 2 days ago
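A short NumPy illustration of this sketch (an added example; the inner product used here is made up for concreteness): build $A$ from the standard basis and check that $(\mathbf{x}, \mathbf{y}) = \mathbf{x}^T A \mathbf{y}$.

    import numpy as np

    # A sample inner product on R^3: (x, y) = x^T P y for a fixed
    # positive definite P (here P = B^T B + I for a random B).
    rng = np.random.default_rng(0)
    B = rng.standard_normal((3, 3))
    P = B.T @ B + np.eye(3)
    inner = lambda x, y: x @ P @ y

    # Recover the Gram matrix A = [(e_i, e_j)] from the standard basis.
    E = np.eye(3)
    A = np.array([[inner(E[i], E[j]) for j in range(3)] for i in range(3)])

    x, y = rng.standard_normal(3), rng.standard_normal(3)
    print(np.allclose(inner(x, y), x @ A @ y))   # True: (x, y) = x^T A y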
















linear-algebra inner-product-space






asked 2 days ago by Ion Sme
edited 2 days ago by J.G.
7 Answers


















Yes. Given any positive definite $n \times n$ matrix $A$,
$$\mathbf{x} \cdot \mathbf{y} := \mathbf{y}^\top A \mathbf{x}$$
defines an inner product on $\Bbb R^n$, and for $n > 1$ there are nondiagonal positive definite matrices, for example,
$$\begin{pmatrix}1&\epsilon\\ \epsilon&1\\ &&1\\ &&&\ddots\\ &&&&1\end{pmatrix}$$
for any $0 < |\epsilon| < 1$.

Conversely, all inner products arise this way, as we can recover $A$ by setting
$$A_{ij} = \mathbf{e}_i \cdot \mathbf{e}_j$$
for the standard basis $(\mathbf{e}_i)$.

On the other hand, given any inner product on $\Bbb R^n$, applying the Gram-Schmidt process produces an orthonormal basis $(\mathbf{f}_i)$, so the matrix representation of the inner product with respect to that basis is the identity matrix, $I_n$. In this sense, all inner products on $\Bbb R^n$ are equivalent.
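As an added illustration (not part of the original answer; assumes NumPy), one can verify numerically that the $\epsilon$-perturbed identity is positive definite and gives a symmetric form:

    import numpy as np

    # The n = 3 example with epsilon = 0.5.
    eps = 0.5
    A = np.array([[1.0, eps, 0.0],
                  [eps, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])

    print(np.linalg.eigvalsh(A))   # [0.5 1.  1.5]: all positive, so A is positive definite

    rng = np.random.default_rng(1)
    x, y = rng.standard_normal(3), rng.standard_normal(3)
    print(np.isclose(y @ A @ x, x @ A @ y))   # True: the form is symmetric
    print(x @ A @ x > 0)                      # True: positive on this (and any nonzero) x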






answered 2 days ago by Travis (edited yesterday)








  • I think that if OP knew what a positive definite symmetric matrix is, there would be no need to pose the question as it is.
    – Marc van Leeuwen, yesterday










  • @MarcvanLeeuwen Thanks for the comment. And yes, I agree, but one needs that language in order to state the result precisely. Still, that probably means OP would benefit from a concrete example, which I've added to the answer.
    – Travis, yesterday



















For any invertible linear transformation $A$ you can define the inner product $\langle v, w\rangle_A = \langle Av, Aw\rangle$, where $\langle\cdot,\cdot\rangle$ denotes the standard inner product. I expect there are no other inner products, which is motivated by the fact that all inner products are known to induce equivalent norms.
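An added sketch (NumPy, illustrative only): this construction is the matrix form $\langle v, w\rangle_A = v^\top (A^\top A)\, w$, i.e. its Gram matrix is $A^\top A$.

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((3, 3))            # almost surely invertible
    assert abs(np.linalg.det(A)) > 1e-12

    inner_A = lambda v, w: (A @ v) @ (A @ w)   # <v, w>_A = <Av, Aw>

    v, w = rng.standard_normal(3), rng.standard_normal(3)
    print(np.isclose(inner_A(v, w), v @ (A.T @ A) @ w))   # True: Gram matrix is A^T A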






answered 2 days ago by SmileyCraft




















Technically, you need positive $\lambda_i$. Or, if we use $\sum_{ij}\lambda_{ij} v_i w_j$, the matrix $\lambda$ is without loss of generality equal to $(\lambda+\lambda^T)/2$, and it has to be positive definite. (Yes, this matrix property has the same name; it basically means it has only positive eigenvalues.) With an appropriate basis change we can then diagonalize this matrix, which recovers the case you knew about. As for the example you tried, it failed because working out the matrix (once we make it self-adjoint as explained above) gives
$$\lambda = \begin{pmatrix}0 & 1\\ 1 & 0\end{pmatrix},$$
which has $-1$ as an eigenvalue.
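An added numerical check (NumPy, not from the original answer) of the "without loss of generality" step: symmetrizing $\lambda$ never changes the quadratic form, and by polarization a symmetric form is determined by its quadratic form.

    import numpy as np

    rng = np.random.default_rng(3)
    lam = rng.standard_normal((3, 3))   # arbitrary, generally non-symmetric
    lam_sym = (lam + lam.T) / 2         # its self-adjoint (symmetric) part

    v, w = rng.standard_normal(3), rng.standard_normal(3)

    # Symmetrizing leaves every value of the quadratic form unchanged ...
    print(np.isclose(v @ lam @ v, v @ lam_sym @ v))   # True

    # ... and a symmetric form is recovered from its quadratic form by polarization.
    B = lambda a, b: a @ lam_sym @ b
    print(np.isclose(B(v, w), (B(v + w, v + w) - B(v - w, v - w)) / 4))   # True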






answered 2 days ago by J.G. (edited yesterday)












    • Where does the condition $\lambda = (\lambda+\lambda^T)/2$ come from?
      – Ion Sme, 2 days ago











    • @IonSme As I said, it's imposed without loss of generality: if you replace $\lambda$ in this way, the resulting inner product won't be changed. (This is obvious in the case $v=w$, but in fact that case is exhaustive, because norms determine an inner product via $\langle v, w\rangle = (\Vert v+w\Vert^2 - \Vert v-w\Vert^2)/4$.)
      – J.G., 2 days ago



















Inner products $p(x,y)$ on $\mathbb{R}^n$ have the form
$$p(x,y) = \sum_{j=1}^n \sum_{k=1}^n a_{jk} x_j y_k$$
where the matrix $A = [a_{jk}]$ is positive definite. Choosing a basis of eigenvectors for the matrix $A$, and expanding according to this new basis instead of the original basis, the inner product is then diagonal:
$$p(x,y) = \sum_{j=1}^n b_j x_j y_j$$
where $b_j > 0$.
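An added NumPy sketch of this diagonalization, using a concrete positive definite $A$ chosen for illustration: in the eigenbasis the form becomes $\sum_j b_j x_j y_j$ with $b_j$ the (positive) eigenvalues.

    import numpy as np

    A = np.array([[2.0, 0.5, 0.0],
                  [0.5, 2.0, 0.5],
                  [0.0, 0.5, 2.0]])      # symmetric positive definite

    b, Q = np.linalg.eigh(A)             # A = Q diag(b) Q^T, columns of Q orthonormal
    print(b)                             # all positive

    rng = np.random.default_rng(4)
    x, y = rng.standard_normal(3), rng.standard_normal(3)

    xp, yp = Q.T @ x, Q.T @ y            # coordinates in the eigenbasis
    print(np.isclose(x @ A @ y, np.sum(b * xp * yp)))   # True: p(x, y) = sum_j b_j x'_j y'_j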






answered 2 days ago by GEdgar




















I agree with SmileyCraft. In finite-dimensional vector spaces, bilinear transformations, like linear transformations, can be written in terms of the values they take on a given basis:
$$\left\langle x,y \right\rangle = \sum_{i,j=1}^n x_i y_j \left\langle e_i, e_j \right\rangle.$$
I believe you can arrive at this representation without difficulty, thereby proving your suspicion.






answered 2 days ago by Jonathan Honório




















It is well known (and easy to prove) that any two finite-dimensional inner product spaces of the same dimension are isometrically isomorphic. Hence $\langle x, y\rangle'$ is an inner product on $\mathbb{R}^n$ iff there is a vector space isomorphism $T: \mathbb{R}^n \to \mathbb{R}^n$ such that $\langle x, y\rangle' = \langle Tx, Ty\rangle$ for all $x, y$.
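An added sketch (NumPy; using a Cholesky factor is one standard way, assumed here, to produce such a $T$): if $\langle x, y\rangle' = x^T A y$ with $A = LL^T$, then $T = L^T$ works.

    import numpy as np

    # A positive definite Gram matrix A defining <x, y>' = x^T A y.
    A = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

    L = np.linalg.cholesky(A)   # A = L L^T, L lower triangular
    T = L.T                     # then <x, y>' = <Tx, Ty> in the standard inner product

    rng = np.random.default_rng(5)
    x, y = rng.standard_normal(2), rng.standard_normal(2)
    print(np.isclose(x @ A @ y, (T @ x) @ (T @ y)))   # True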






answered 2 days ago by Kavi Rama Murthy




















In general, for any $n\times n$ matrix $A = (a_{i,j})_{i,j=1,\ldots,n}$ the expression $\sum_{i,j=1}^n a_{i,j} x_i y_j$ defines a bilinear form, which will be symmetric if and only if $A$ is. Giving a positive definite symmetric bilinear form is a more subtle condition that leads to inequalities for the coefficients of $A$ (and the matrices that satisfy the condition are naturally called positive definite). For $2\times 2$ symmetric matrices the positive definite condition is $a_{1,1}>0$, $a_{2,2}>0$ together with $a_{1,1}a_{2,2}-a_{1,2}^2>0$ (so $\det(A)>0$). For a concrete example, the symmetric matrix
$$A=\begin{pmatrix}1&\frac12\\ \frac12&1\end{pmatrix} \quad\text{gives an inner product with}\quad \langle v, w\rangle = v_1w_1 + \tfrac12(v_1w_2+v_2w_1) + v_2w_2\,.$$

In higher dimension the condition is more complicated, but in any case one does get many different inner products on $\Bbb R^n$ in this way. They do turn out to all be equivalent in the sense that they give rise to the same structure theory, but they are not equal.
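An added check (NumPy, illustrative) of the $2\times 2$ criteria and of the concrete example:

    import numpy as np

    A = np.array([[1.0, 0.5],
                  [0.5, 1.0]])

    # The 2x2 test: a_{1,1} > 0, a_{2,2} > 0, and det(A) > 0.
    print(A[0, 0] > 0, A[1, 1] > 0, np.linalg.det(A) > 0)   # True True True

    v = np.array([3.0, -4.0])
    w = np.array([1.0, 2.0])
    lhs = v @ A @ w
    rhs = v[0]*w[0] + 0.5*(v[0]*w[1] + v[1]*w[0]) + v[1]*w[1]
    print(np.isclose(lhs, rhs))   # True: matches the displayed formula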






answered yesterday by Marc van Leeuwen (edited 9 hours ago)











