Expectation of random process
I have a random process defined as
$$X(t) = A \sin(\omega t + \phi),$$
where $A$ and $\omega$ are independent and $\phi$ is distributed $U[0, 2\pi]$. I would like to find $E[X(t)]$.

I believe the answer is $0$, because the expectation of a product of independent random variables is the product of their expectations, i.e. $E[X(t)] = E[A]\, E[\sin(\omega t + \phi)]$. Since $\phi$ is distributed uniformly over a full cycle, the mean of the $\sin(\cdot)$ term must be zero, which makes the whole quantity $0$.

Is this correct?

stochastic-processes random-variables
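To make the claim about the $\sin(\cdot)$ term concrete, here is a quick numerical check (not part of the original post) that $E[\sin(x + \phi)] = 0$ for any fixed $x$ when $\phi \sim U[0, 2\pi]$; here $x$ plays the role of $\omega t$:

```python
import math

def mean_sin_shifted(x, n=100_000):
    """Average sin(x + phi) over phi uniform on [0, 2*pi] (midpoint rule)."""
    step = 2.0 * math.pi / n
    # midpoint Riemann sum of sin(x + phi) * d(phi) / (2*pi)
    return sum(math.sin(x + (k + 0.5) * step) for k in range(n)) * step / (2.0 * math.pi)

for x in (0.0, 1.0, 2.5):
    print(mean_sin_shifted(x))  # each value is ~0 up to roundoff
```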
asked Mar 19 at 20:51
Avedis
2 Answers
Yes, provided $(A, \omega, \phi)$ are mutually independent.

Indeed, if $(A, \omega, \phi)$ are mutually independent, then $A$ is independent of $(\omega, \phi)$, so you can write $\mathbb{E}[X(t)] = \mathbb{E}[A]\, \mathbb{E}[\sin(\omega t + \phi)]$. Moreover, $\omega$ is independent of $\phi$, so
$$
\mathbb{E}[\sin(\omega t + \phi)] = \mathbb{E}[\sin(\omega t)\cos(\phi)] + \mathbb{E}[\sin(\phi)\cos(\omega t)] = \mathbb{E}[\sin(\omega t)]\, \mathbb{E}[\cos(\phi)] + \mathbb{E}[\sin(\phi)]\, \mathbb{E}[\cos(\omega t)] = 0,
$$
since $\mathbb{E}[\cos(\phi)] = \mathbb{E}[\sin(\phi)] = 0$ for $\phi \sim U[0, 2\pi]$.

I used the identity $\sin(a+b) = \sin(a)\cos(b) + \sin(b)\cos(a)$ because it requires the least knowledge of probability theory. If you are more familiar with probability theory, here is another way to conclude: since $\omega$ and $\phi$ are independent, we have
$$
\mathbb{E}[\sin(\omega t + \phi)] = \mathbb{E}[f(\omega t)],
$$
where $f : \mathbb{R} \to \mathbb{R}$ is defined for all $x \in \mathbb{R}$ by $f(x) = \mathbb{E}[\sin(x + \phi)] = 0$.
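As a sanity check (not part of the original answer), a Monte Carlo estimate of $\mathbb{E}[X(t)]$ under mutual independence. The distributions chosen for $A$ and $\omega$ below are arbitrary illustrations, since the question does not specify them:

```python
import math
import random

def mc_mean(t, n=200_000, seed=0):
    """Monte Carlo estimate of E[A * sin(w*t + phi)] with mutually
    independent A, w, phi and phi ~ Uniform[0, 2*pi]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        a = rng.uniform(0.0, 2.0)        # arbitrary amplitude distribution
        w = rng.gauss(5.0, 1.0)          # arbitrary frequency distribution
        phi = rng.uniform(0.0, 2.0 * math.pi)
        total += a * math.sin(w * t + phi)
    return total / n

print(abs(mc_mean(t=0.3)))  # small, consistent with E[X(t)] = 0
```

The estimate's standard error here is roughly $0.002$, so values within a few hundredths of zero are what the answer predicts.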
answered Mar 19 at 21:30
Will
With $A$ and $\omega$ held fixed, average over the phase $\phi \sim U[0, 2\pi]$:
\begin{align}
\int_0^{2\pi} A \sin(\omega t + \phi)\, \frac{d\phi}{2\pi}
&= \left. -\frac{A}{2\pi} \cos(\omega t + \phi) \right|_{\phi = 0}^{\phi = 2\pi} \\
&= -\frac{A}{2\pi} \cos(\omega t + 2\pi) + \frac{A}{2\pi} \cos(\omega t + 0) = 0,
\end{align}
because $\cos$ is a periodic function of period $2\pi$.
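The phase integral above can be checked numerically for arbitrarily chosen values of $A$, $\omega$, and $t$ (a sketch, not part of the original answer):

```python
import math

def phase_average(a, w, t, n=100_000):
    """Midpoint-rule evaluation of (1/(2*pi)) * integral over phi in
    [0, 2*pi] of a * sin(w*t + phi) d(phi)."""
    step = 2.0 * math.pi / n
    acc = 0.0
    for k in range(n):
        phi = (k + 0.5) * step   # midpoint of the k-th subinterval
        acc += a * math.sin(w * t + phi)
    return acc * step / (2.0 * math.pi)

print(phase_average(a=3.0, w=2.0, t=0.7))  # ~0, by periodicity of cos
```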
answered Mar 19 at 23:51
Felix Marin