Is it true that the sum of zero mean i.i.d. random variables oscillates around $0$ infinitely often?
Is this true in general (without assuming the existence of a variance, etc.)? The only things we know are that the r.v.s are integer valued, integrable with mean $0$, and positive with non-zero probability.
The only thing I could come up with is that $S_n/n$ goes to $0$ (where $S_n$ is the partial sum), but this can happen even if $S_n$ stays positive forever.
My intuition is that this should be true because it holds for simple random walks (which have unit step size), and a larger step size should only increase the probability of the sum becoming negative when it was previously positive. I am not sure how to formalize this. Can someone please help?
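As a quick sanity check on that intuition, here is a small Monte Carlo sketch (my own illustration; the step distribution, $+2$ with probability $1/3$ and $-1$ with probability $2/3$, is just one hypothetical zero-mean, integer-valued choice). It counts how often the partial sums switch sign:

```python
import random

def count_sign_changes(step_sampler, n_steps, seed=0):
    """Count how many times the partial sum S_n switches strict sign."""
    rng = random.Random(seed)
    s, changes, last_sign = 0, 0, 0
    for _ in range(n_steps):
        s += step_sampler(rng)
        sign = (s > 0) - (s < 0)
        if sign != 0:
            if last_sign != 0 and sign != last_sign:
                changes += 1
            last_sign = sign
    return changes

# Illustrative zero-mean integer steps: +2 w.p. 1/3, -1 w.p. 2/3.
step = lambda rng: 2 if rng.random() < 1 / 3 else -1
print(count_sign_changes(step, 100_000))
```

In runs I would expect many sign changes over a long horizon, which is consistent with (but of course does not prove) the conjectured oscillation.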
probability-theory
@Ian That would require finite variance, though, I believe. – leonbloy, Mar 19 at 21:23
There is a proof walkthrough in Exercises 4.1.8 through 4.1.11 in Probability: Theory and Examples by Durrett. – Mike Earnest, Mar 19 at 22:28
How could a positive r.v. have a zero mean? – Mostafa Ayaz, Mar 19 at 23:17
@MostafaAyaz They are positive with non-zero probability; that probability has to be less than one for the mean to be zero. – cauthon14, Mar 19 at 23:29
edited Mar 19 at 21:28 by leonbloy
asked Mar 19 at 21:17 by cauthon14
1 Answer
Let $S_n=X_1+\dots+X_n$. This proof works without the assumption that $X_i$ is integer valued. Let
$$
\begin{array}{cc}
\alpha=\inf\{n>0: S_n>0\} & \beta=\inf\{n>0: S_n<0\}\\
\alpha'=\inf\{n>0: S_n\ge 0\} & \beta'=\inf\{n>0: S_n\le 0\}
\end{array}
$$
Using Wald's equation, you can show that
$$
E[\alpha]=E[\beta]=\infty.
$$
Otherwise, you would have $E[S_\alpha]=E[X_1]\,E[\alpha]=0\cdot E[\alpha]=0$, contradicting $S_\alpha>0$.
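The divergence $E[\alpha]=\infty$ can be glimpsed numerically. In this sketch (my own illustration, using a simple $\pm 1$ walk as a hypothetical stand-in for the general $X_i$), the truncated mean of the first strictly positive time keeps growing as the truncation cap increases, as one expects for a divergent expectation:

```python
import random

def first_positive_time(rng, cap):
    """First n with S_n > 0 for a +/-1 walk, truncated at `cap`."""
    s = 0
    for n in range(1, cap + 1):
        s += 1 if rng.random() < 0.5 else -1
        if s > 0:
            return n
    return cap

def truncated_mean_alpha(cap, trials=2000, seed=0):
    """Monte Carlo estimate of E[min(alpha, cap)]."""
    rng = random.Random(seed)
    return sum(first_positive_time(rng, cap) for _ in range(trials)) / trials

# The estimate keeps growing with the cap instead of leveling off.
print(truncated_mean_alpha(100), truncated_mean_alpha(10_000))
```

This is only suggestive, of course; the proof above is what actually establishes $E[\alpha]=\infty$.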
Next is the really tricky part. For each $n\ge 0$, define $I_n$ to be the index $i\in\{0,1,\dots,n\}$ for which $S_i$ is minimized, with ties going to the latest such index. I claim that
$$
P(I_n=m)=P(\alpha>m)\,P(\beta'>n-m).
$$
To see this, note that $\{I_n=m\}$ is determined by the first $n$ steps $X_1,\dots,X_n$ of the process, while $\{\alpha>m\}$ is determined by the first $m$ steps and $\{\beta'>n-m\}$ by the first $n-m$ steps. Furthermore, $I_n=m$ occurs for $(X_1,\dots,X_n)$ if and only if $\alpha>m$ occurs for $(X_m,X_{m-1},\dots,X_1)$ (note the reversal!) and $\beta'>n-m$ occurs for $(X_{m+1},X_{m+2},\dots,X_n)$. $\square$
The payoff of that tricky lemma is that we can show
$\alpha'$ and $\beta'$ are almost surely finite.
We start with
$$
1=\sum_{m=0}^n P(I_n=m)=\sum_{m=0}^n P(\alpha>m)\,P(\beta'>n-m).
$$
Now, let $n\to\infty$. Each summand $P(\alpha>m)\,P(\beta'>n-m)$ converges to $P(\alpha>m)\,P(\beta'=\infty)$, so we get
$$
E[\alpha]=\sum_{m=0}^\infty P(\alpha>m)=\frac{1}{P(\beta'=\infty)}.
$$
We already proved $E[\alpha]=\infty$, so this shows that $\beta'$ is almost surely finite. The same goes for $\alpha'$.
Finally,
$S_n$ is positive and negative infinitely often.
Just as $\alpha'$ is the first time after $0$ that the process is nonnegative, define $\alpha'(k)$ inductively to be the first time after $\alpha'(k-1)$ that $S_n\ge S_{\alpha'(k-1)}$. Now, the sequence of steps
$$
X_{\alpha'+1},\,X_{\alpha'(2)+1},\,X_{\alpha'(3)+1},\dots
$$
is i.i.d., distributed like $X_1$. Assuming $X_1$ is nontrivial, these steps have a nonzero probability of being positive, so with probability $1$ infinitely many of them are positive. If $X_{\alpha'(k)+1}$ is positive, then $S_{\alpha'(k)+1}=S_{\alpha'(k)}+X_{\alpha'(k)+1}>0$. Therefore, there are infinitely many times of the form $\alpha'(k)+1$ at which $S_n>0$. The same goes for times where $S_n<0$.
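To illustrate this conclusion numerically (again with a hypothetical zero-mean step distribution, $+2$ w.p. $1/3$ and $-1$ w.p. $2/3$, not one from the question), one can count record highs and record lows of the walk. Both counts keep accruing over a long run, which matches $S_n$ being positive and negative infinitely often:

```python
import random

def record_counts(step_sampler, n_steps, seed=0):
    """Count strict running-max records (new highs) and strict running-min
    records (new lows) of the partial sums S_n, starting from S_0 = 0."""
    rng = random.Random(seed)
    s = hi = lo = 0
    hi_records = lo_records = 0
    for _ in range(n_steps):
        s += step_sampler(rng)
        if s > hi:
            hi, hi_records = s, hi_records + 1
        if s < lo:
            lo, lo_records = s, lo_records + 1
    return hi_records, lo_records

step = lambda rng: 2 if rng.random() < 1 / 3 else -1  # zero-mean example
print(record_counts(step, 100_000))
```

Every new high is a time with $S_n>0$ and every new low a time with $S_n<0$, so large counts of both are exactly the oscillation the proof guarantees.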
edited Mar 20 at 18:07 by cauthon14
answered Mar 20 at 17:32 by Mike Earnest