
Expectation of random process





I have a random process defined as

$$X(t) = A \sin(\omega t + \phi)$$

where $A$ and $\omega$ are independent and $\phi$ is distributed $U[0,2\pi]$.

I would like to find $E[X(t)]$.

I believe the answer is $0$, because the expectation of a product of independent random variables is the product of their individual expectations, i.e. $E[X(t)] = E[A]\, E[\sin(\omega t + \phi)]$.

Since $\phi$ is distributed uniformly over a full cycle, the mean of the $\sin(\cdot)$ term must be zero, thereby making the whole quantity $0$.

Is this correct?










      stochastic-processes random-variables






          2 Answers



















Yes, if $(A,\omega,\phi)$ are mutually independent.

Indeed, if $(A,\omega,\phi)$ are mutually independent, then $A$ is independent of $(\omega,\phi)$, so you can write $\mathbb{E}[X(t)]=\mathbb{E}[A]\,\mathbb{E}[\sin(\omega t+\phi)]$.

Moreover, $\omega$ is independent of $\phi$, so
$$
\mathbb{E}[\sin(\omega t+\phi)]=\mathbb{E}[\sin(\omega t)\cos(\phi)]+\mathbb{E}[\sin(\phi)\cos(\omega t)]=\mathbb{E}[\sin(\omega t)]\,\mathbb{E}[\cos(\phi)]+\mathbb{E}[\sin(\phi)]\,\mathbb{E}[\cos(\omega t)]=0,
$$
since $\mathbb{E}[\cos(\phi)]=\mathbb{E}[\sin(\phi)]=0$ when $\phi\sim U[0,2\pi]$.

I used the formula $\sin(a+b)=\sin(a)\cos(b)+\sin(b)\cos(a)$ because it requires the least probability theory. If you are more comfortable with probability theory, here is another way to conclude: since $\omega$ and $\phi$ are independent, we have
$$
\mathbb{E}[\sin(\omega t+\phi)]=\mathbb{E}[f(\omega t)],
$$
where $f:\mathbb{R}\to\mathbb{R}$ is defined for all $x\in\mathbb{R}$ by $f(x)=\mathbb{E}[\sin(x+\phi)]=0$.
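A quick Monte Carlo sanity check of this conclusion (not part of the original answer): the sketch below fixes a time $t$ and averages many samples of $X(t)$. The specific distributions chosen for $A$ and $\omega$ are arbitrary placeholders; any choice works as long as $A$, $\omega$, $\phi$ are mutually independent and $\phi\sim U[0,2\pi]$.

```python
# Monte Carlo check that E[X(t)] is approximately 0 for X(t) = A*sin(omega*t + phi).
# The distributions of A and omega are illustrative assumptions; phi ~ U[0, 2*pi].
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000
t = 1.7  # arbitrary fixed time instant

A = rng.exponential(scale=1.0, size=n)        # amplitude (assumed Exp(1))
omega = rng.uniform(0.0, 2.0, size=n)         # frequency (assumed U[0, 2])
phi = rng.uniform(0.0, 2.0 * np.pi, size=n)   # phase, uniform over one full cycle

X_t = A * np.sin(omega * t + phi)
print(X_t.mean())  # near 0, up to Monte Carlo error on the order of 1/sqrt(n)
```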







Averaging over the uniformly distributed phase $\phi$,
\begin{align}
\frac{\displaystyle\int_{0}^{2\pi} A\sin(\omega t + \phi)\,\mathrm{d}\phi}{2\pi}
&= \left.-\,\frac{A}{2\pi}\,\cos(\omega t + \phi)\,\right\vert_{\,\phi=0}^{\,\phi=2\pi}
\\[5mm]
&= -\,\frac{A}{2\pi}\,\cos(\omega t + 2\pi) + \frac{A}{2\pi}\,\cos(\omega t + 0) = 0
\end{align}

$\cos$ is a periodic function of period $2\pi$.
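For anyone who wants to double-check this phase average symbolically, here is a short SymPy sketch (my own addition, not from the answer); `A`, `omega`, and `t` are placeholder symbols:

```python
# Symbolic check that the average of A*sin(omega*t + phi) over one full cycle
# of phi is zero. A, omega, t are illustrative free symbols.
import sympy as sp

A, omega, t, phi = sp.symbols('A omega t phi', real=True)
avg = sp.integrate(A * sp.sin(omega * t + phi), (phi, 0, 2 * sp.pi)) / (2 * sp.pi)
print(sp.simplify(avg))  # prints 0
```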






