Is geodesic distance equivalent to “norm distance” in $SL_n(\mathbb{R})$?



Take any norm $\|\cdot\|$ on $\mathbb{R}^n$, and consider the resulting operator norm on $SL_n(\mathbb{R})$:

$$\|A\| := \sup_{\|v\|=1}\|Av\|.$$

Now take any left-invariant Riemannian metric, $g$, on $SL_n$. How do the geodesic balls $B_g(I, r)$ around the identity matrix $I$ compare with the metric balls $B_{\|\cdot\|}(I,r)$ coming from $\|\cdot\|$? In particular, do there exist $c, C > 0$ such that $$B_{\|\cdot\|}(I,cr)\subset B_g(I, r) \subset B_{\|\cdot\|}(I,Cr)$$ for all sufficiently small $r$? Or anything of the sort?










      differential-geometry lie-groups riemannian-geometry






asked Feb 23 '16 at 5:42 by Tim kinsella, edited Feb 23 '16 at 7:38

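For a rough numerical illustration of what is being asked (a sketch only, not part of any proof): take the left-invariant metric $g$ whose value at $I$ is the Frobenius inner product. Near the identity, $\|\log A\|_F$ agrees with the geodesic distance $d_g(I,A)$ to leading order, so it can serve as a proxy. The Python snippet below (using NumPy/SciPy; the helper `random_sl2_near_identity` is purely illustrative) compares this proxy with the operator-norm distance $\|A-I\|$ for random elements of $SL_2(\mathbb{R})$ close to $I$; the ratio staying within fixed positive bounds is the two-sided behaviour asked about.

    import numpy as np
    from scipy.linalg import expm, logm

    rng = np.random.default_rng(0)

    def random_sl2_near_identity(eps):
        """Return exp(X) for a random traceless X with entries of size ~eps,
        so the result lies in SL_2(R) close to the identity."""
        X = eps * rng.normal(size=(2, 2))
        X -= (np.trace(X) / 2.0) * np.eye(2)   # traceless => det(exp(X)) = 1
        return expm(X)

    ratios = []
    for _ in range(2000):
        A = random_sl2_near_identity(1e-2)
        op_dist = np.linalg.norm(A - np.eye(2), 2)      # operator-norm distance to I
        geo_proxy = np.linalg.norm(logm(A), 'fro')      # leading-order proxy for d_g(I, A)
        ratios.append(geo_proxy / op_dist)

    # If the two distances are equivalent near I, these ratios stay within
    # fixed positive bounds (they neither blow up nor collapse to 0).
    print(f"ratio range: [{min(ratios):.4f}, {max(ratios):.4f}]")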




















2 Answers







          Here's what I see in Einsiedler-Ward:



Your norm on $SL_n(\mathbb{R})$ is induced by the operator norm on the vector space $M_n(\mathbb{R})$. Being a norm on a finite-dimensional vector space, it is equivalent to the Euclidean norm $|\cdot|$ on $M_n(\mathbb{R})$.



Now any left-invariant Riemannian metric $\langle\,\cdot,\cdot\,\rangle$ on $TG=G\times \mathfrak{g}$ is determined by its restriction to $TG_I=\{I\}\times \mathfrak{g} \cong \mathfrak{g} \subset M_n(\mathbb{R})$. Without loss of generality, we can assume that the Riemannian metric restricted to $\mathfrak{g}$ is induced by the Euclidean norm on $M_n(\mathbb{R})$. Let $d$ be the distance on $G$ induced by the path-length formula for $\langle\,\cdot,\cdot\,\rangle$.



Let $B$ be a pre-compact neighbourhood of $I$ in $G$ on which the local inverse $\log$ of the exponential map is defined. Assume $\log(B)$ is a convex ball in $\mathfrak{g}$. Let $B'$ be another pre-compact neighbourhood containing the closure of $B$.



Say $\phi:[0,1] \to B'$ joins $g_0, g_1 \in B$. Then, since the norm of $\phi(t)$ is bounded, we get $c>0$ (independent of $g_0,g_1$) such that



$$L(\phi):= \int\langle D\phi(t), D\phi(t) \rangle^{1/2}\,dt = \int \left\langle DL_{\phi(t)}^{-1}\circ D\phi(t),\; DL_{\phi(t)}^{-1}\circ D\phi(t)\right\rangle^{1/2} dt = \int |\phi(t)^{-1}\phi'(t)|\, dt
\\ \geq c\int|\phi'(t)|\,dt \geq c|g_1-g_0|.$$



This shows that $c|g_1-g_0| \leq d(g_0,g_1)$ if the infimum of path lengths is taken over paths which remain in $B'$. But since $d\left(B,(B')^c\right)>0$, we can assume that this estimate holds in general. Hence for all $g_0,g_1 \in B$, we have



$$c|g_1-g_0| \leq d(g_0,g_1) \qquad (1)$$



          and it remains to show a reverse inequality.



Consider the path $\phi:[0,1] \to B$ given by $t \mapsto \exp\left(\log g_0 + t(\log g_1-\log g_0)\right)$. This is well defined since we assumed $\log(B)$ was a convex ball in $\mathfrak{g}$. Then, since the norm of $\phi(t)^{-1}$ is bounded, since $d(\exp)$ is bounded on $\log(B)$, and since $\log$ is Lipschitz (by the mean value theorem) in a neighbourhood of $I$,



$$d(g_0,g_1) \leq \int\langle D\phi(t), D\phi(t) \rangle^{1/2}\,dt = \int |\phi(t)^{-1}\phi'(t)|\,dt \leq \int C_1|\phi'(t)|\,dt \leq C_1C_2|g_1-g_0|$$



for some $C_1, C_2>0$ (independent of $g_0, g_1$). Hence for all $g_0,g_1\in B$, we have



$$ d(g_0,g_1) \leq C |g_0-g_1|. \qquad (2)$$
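Combining (1) and (2) gives the ball inclusions the question asks for: for all sufficiently small $r$ (small enough that the relevant balls lie in $B$),

$$B_{|\cdot|}(I, r/C) \;\subset\; B_d(I, r) \;\subset\; B_{|\cdot|}(I, r/c),$$

and since the Euclidean norm and the operator norm on $M_n(\mathbb{R})$ are equivalent, the same inclusions hold, with adjusted constants, for the operator-norm balls in the question.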






answered Mar 26 at 17:37, edited Mar 28 at 15:41 by Sir Wilfred Lucas-Dockery
• (+1) ..................... – Tim kinsella, Mar 29 at 5:32










• Thank you sir. And if you get the time, could you do the same with your $11$ other accounts? – Sir Wilfred Lucas-Dockery, Mar 29 at 17:59



















I can prove the following. If $h$ denotes the metric on $SL(n,\mathbb{R})$ coming from the operator norm, and $h_1$ denotes a left-invariant metric on $SL(n,\mathbb{R})$, then, at $g \in SL(n,\mathbb{R})$ and for any vector $x$ tangent to $SL(n,\mathbb{R})$ at $g$, we have:



$h(x,x) \leq \|g\|^2\, h_1(x,x)$



          I just used left translations, and things like that. If interested, I can write some more details. Also, one can prove (using the equivalence of any 2 norms on a finite-dimensional vector space) that there is a $C>0$ such that:



$h_1(x,x) \leq C\, \|g^{-1}\|^2\, h(x,x)$.



Hence, if $\|g\|$ and $\|g^{-1}\|$ are bounded above by some constants, then the two metrics induce uniformly equivalent norms on the tangent spaces of that region. In particular, there exists a neighborhood of the identity in $SL(n,\mathbb{R})$ for which the two geodesic distances are equivalent.
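To sketch how these two pointwise bounds yield the last claim: on a region where $\|g\| \le M$ and $\|g^{-1}\| \le M$, they control the length of any path $\phi$ that stays in that region,

$$L_h(\phi)=\int_0^1 h\big(\phi'(t),\phi'(t)\big)^{1/2}dt \;\le\; M\, L_{h_1}(\phi), \qquad L_{h_1}(\phi) \;\le\; \sqrt{C}\,M\, L_h(\phi),$$

so taking infima over such paths (restricting, as in the other answer, to paths that stay in a slightly larger pre-compact neighbourhood) shows the two induced distances are bi-Lipschitz equivalent near the identity.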






answered Apr 21 '16 at 14:25, edited Apr 21 '16 at 14:40 by Malkoun
• Thanks for your answer. By $h(x,x)$ do you just mean the square of the operator norm of $x$, thinking of $x$ as a matrix acting on $\mathbb{R}^n$ (let's assume $g=e$)? – Tim kinsella, Apr 23 '16 at 0:25











• Tim kinsella: yes correct, this is what I mean. – Malkoun, Apr 23 '16 at 0:29










• Thanks again. I don't completely understand the last paragraph. If we were talking about two Riemannian metrics which induced uniformly equivalent norms in a neighborhood of $e$, then I think I would be able to fill in the details. But let me ponder a little more. – Tim kinsella, Apr 23 '16 at 0:43










• Tim kinsella: I must admit the last paragraph in my answer is not written clearly! However, the two inequalities should give you what you were hoping for. In any case, I can provide more details if you want, in particular the proofs of the two inequalities, or how to use them. – Malkoun, Apr 23 '16 at 14:35










