Is it true that the sum of zero mean i.i.d. random variables oscillates around $0$ infinitely often?

Is this true in general (without assuming existence of the variance, etc.)? The only things we know are that the r.v.s are positive with non-zero probability, integer valued, and integrable with mean $0$.



The only thing I could come up with is that $S_n/n \to 0$ almost surely (where $S_n$ denotes the partial sum). But this can happen even if $S_n$ were always positive.



My intuition is that this should be true, because it happens for simple random walks (which have unit step size), and a larger step size should only increase the probability of the sum becoming negative when it was positive before. I am not sure how to formalize this. Can someone please help?
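This intuition can at least be sanity-checked numerically. Below is a quick Monte Carlo sketch (not a proof) using one concrete integer-valued, zero-mean step distribution of my choosing, purely for illustration: $X=+2$ with probability $1/3$ and $X=-1$ with probability $2/3$. Nearly every long path takes both signs:

```python
import random

def crosses_both_signs(n_steps, rng):
    # Walk the partial sums S_n of i.i.d. steps X = +2 (prob 1/3), X = -1 (prob 2/3),
    # a zero-mean integer-valued distribution; report whether the path is
    # strictly positive at some time and strictly negative at some time.
    s = 0
    seen_pos = seen_neg = False
    for _ in range(n_steps):
        s += 2 if rng.random() < 1 / 3 else -1
        seen_pos = seen_pos or s > 0
        seen_neg = seen_neg or s < 0
    return seen_pos and seen_neg

rng = random.Random(0)  # fixed seed for reproducibility
n_walks, n_steps = 50, 2000
frac = sum(crosses_both_signs(n_steps, rng) for _ in range(n_walks)) / n_walks
print(f"fraction of walks taking both signs: {frac:.2f}")
```

The fraction comes out close to $1$ and tends to $1$ as the horizon grows, consistent with the claim (though of course no simulation can distinguish "infinitely often" from "for a very long time").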

• leonbloy (Mar 19 at 21:23): @Ian That would require finite variance, though, I believe.

• Mike Earnest (Mar 19 at 22:28): There is a proof walkthrough in Exercises 4.1.8 through 4.1.11 in Probability: Theory and Examples by Durrett.

• Mostafa Ayaz (Mar 19 at 23:17): How could a positive r.v. have a zero mean?

• cauthon14 (Mar 19 at 23:29): @MostafaAyaz They are positive with non-zero probability. That probability has to be less than one for the mean to be zero.
probability-theory

asked Mar 19 at 21:17 by cauthon14 · edited Mar 19 at 21:28 by leonbloy
1 Answer
Let $S_n=X_1+\dots+X_n$. This proof will work without the assumption that $X_i$ is integer valued. Let
$$
\begin{array}{cc}
\alpha=\inf\{n>0: S_n>0\} & \beta=\inf\{n>0: S_n<0\}\\
\alpha'=\inf\{n>0: S_n\ge 0\} & \beta'=\inf\{n>0: S_n\le 0\}
\end{array}
$$
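In words, $\alpha$ and $\beta$ are the first times the walk is strictly positive/negative, and $\alpha'$, $\beta'$ the first times it is nonnegative/nonpositive. A tiny helper makes the definitions concrete on a sample path (illustration only; the function name is mine):

```python
def first_hit(steps, pred):
    # Smallest n > 0 with pred(S_n), where S_n is the n-th partial sum;
    # returns None if the (finite) path never satisfies pred.
    s = 0
    for n, x in enumerate(steps, start=1):
        s += x
        if pred(s):
            return n
    return None

steps = [1, -1, -1, 1, 1]                          # partial sums: 1, 0, -1, 0, 1
alpha       = first_hit(steps, lambda s: s > 0)    # alpha  = 1
beta        = first_hit(steps, lambda s: s < 0)    # beta   = 3
alpha_prime = first_hit(steps, lambda s: s >= 0)   # alpha' = 1
beta_prime  = first_hit(steps, lambda s: s <= 0)   # beta'  = 2
```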



Using Wald's equation, you can show that
$$
E[\alpha]=E[\beta]=\infty.
$$
Otherwise, if $E[\alpha]$ were finite, Wald's equation would give $E[S_\alpha]=E[X_1]\,E[\alpha]=0\cdot E[\alpha]=0$, contradicting $S_\alpha>0$.
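For the simple random walk ($X_i=\pm1$ with probability $1/2$ each, my illustrative choice) the divergence of $E[\alpha]$ can be watched numerically: the truncated sum $\sum_{m=0}^{N}P(\alpha>m)$, which increases to $E[\alpha]$, grows like $\sqrt{N}$ without bound. A small dynamic-programming sketch:

```python
def truncated_E_alpha(N):
    # sum_{m=0}^{N} P(alpha > m) for the fair +-1 walk, where
    # {alpha > m} = {S_1 <= 0, ..., S_m <= 0}.
    # alive[s] = P(S_1 <= 0, ..., S_k <= 0 and S_k = s).
    alive = {0: 1.0}
    total = 1.0  # the m = 0 term: P(alpha > 0) = 1
    for _ in range(N):
        new = {}
        for s, p in alive.items():
            for step in (-1, 1):
                if s + step <= 0:  # paths that go positive are killed
                    new[s + step] = new.get(s + step, 0.0) + p / 2
        alive = new
        total += sum(alive.values())  # adds P(alpha > k)
    return total

for N in (100, 400, 1600):
    print(N, round(truncated_E_alpha(N), 2))
```

The totals roughly double each time $N$ quadruples, as expected when $P(\alpha>m)\sim c/\sqrt{m}$, so $E[\alpha]=\sum_m P(\alpha>m)=\infty$.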



Next is the real tricky part. For each $n\ge 0$, define $I_n$ to be the index $i\in\{0,1,\dots,n\}$ at which $S_i$ is minimized, with ties going to the latest such index. I claim that




$$
P(I_n=m)=P(\alpha>m)\,P(\beta'>n-m).
$$




To see this, note that $\{I_n=m\}$ is determined by the first $n$ steps $X_1,\dots,X_n$ of the process, while $\{\alpha>m\}$ is determined by the first $m$ steps and $\{\beta'>n-m\}$ by the first $n-m$ steps. Furthermore, $I_n=m$ occurs for $(X_1,\dots,X_n)$ if and only if $\alpha>m$ occurs for $(X_m,X_{m-1},\dots,X_1)$ (note the reversal!) and $\beta'>n-m$ occurs for $(X_{m+1},X_{m+2},\dots,X_n)$. $\square$
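Since the argument enumerates over all step sequences, and reversing a block of i.i.d. steps does not change counts, the identity can be brute-force checked for the fair $\pm1$ walk (a small verification sketch; $n=8$ is enough to exercise every case):

```python
from itertools import product

def partial_sums(xs):
    s, out = 0, []
    for x in xs:
        s += x
        out.append(s)
    return out

def I_n(xs):
    # Index i in {0, ..., n} minimizing S_i (with S_0 = 0), ties to the latest index.
    best_i, best_v = 0, 0
    for i, v in enumerate(partial_sums(xs), start=1):
        if v <= best_v:
            best_i, best_v = i, v
    return best_i

n = 8
N = [0] * (n + 1)                       # N[m] = #{paths of length n with I_n = m}
for xs in product([-1, 1], repeat=n):
    N[I_n(xs)] += 1

# A[m] = #{length-m paths with all partial sums <= 0}, i.e. alpha > m
# B[k] = #{length-k paths with all partial sums  > 0}, i.e. beta' > k
A = [sum(all(s <= 0 for s in partial_sums(xs)) for xs in product([-1, 1], repeat=m))
     for m in range(n + 1)]
B = [sum(all(s > 0 for s in partial_sums(xs)) for xs in product([-1, 1], repeat=k))
     for k in range(n + 1)]

for m in range(n + 1):
    # counts version of P(I_n = m) = P(alpha > m) P(beta' > n - m)
    assert N[m] == A[m] * B[n - m]
```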



The payoff of that tricky lemma is that we can show:




$\alpha'$ and $\beta'$ are almost surely finite.




We start with
$$
1=\sum_{m=0}^n P(I_n=m)=\sum_{m=0}^n P(\alpha>m)\,P(\beta'>n-m).
$$
Now let $n\to\infty$. Each summand $P(\alpha>m)\,P(\beta'>n-m)$ converges to $P(\alpha>m)\,P(\beta'=\infty)$, so we get
$$
E[\alpha]=\sum_{m=0}^\infty P(\alpha>m)=\frac{1}{P(\beta'=\infty)}.
$$
We already proved $E[\alpha]=\infty$, so this forces $P(\beta'=\infty)=0$, i.e. $\beta'$ is almost surely finite. The same goes for $\alpha'$.
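For the fair $\pm1$ walk (again my illustrative choice) this is visible concretely: $P(\beta'>n)$ can be computed exactly and decays to $0$, matching the classical formula $P(\beta'>2m)=\binom{2m}{m}/(2\cdot 4^m)$. A sketch with exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

def p_beta_prime_gt(n):
    # P(S_1 > 0, ..., S_n > 0) = P(beta' > n) for the fair +-1 walk.
    # dist[s] = P(S_1 > 0, ..., S_k > 0 and S_k = s); mass hitting <= 0 is killed.
    dist = {0: Fraction(1)}
    for _ in range(n):
        new = {}
        for s, p in dist.items():
            for step in (-1, 1):
                if s + step > 0:
                    new[s + step] = new.get(s + step, Fraction(0)) + p / 2
        dist = new
    return sum(dist.values(), Fraction(0))

# exact check against the closed form P(beta' > 2m) = C(2m, m) / (2 * 4**m)
for m in (1, 2, 5, 10):
    assert p_beta_prime_gt(2 * m) == Fraction(comb(2 * m, m), 2 * 4 ** m)
```

Since $P(\beta'=\infty)=\lim_n P(\beta'>n)=0$ here, this agrees with $\beta'$ being almost surely finite.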



Finally,




$S_n$ is positive and negative infinitely often.




Just as $\alpha'$ is the first time after $0$ that the process is nonnegative, define $\alpha'(k)$ inductively to be the first time after $\alpha'(k-1)$ that $S_n\ge S_{\alpha'(k-1)}$. Now, the sequence of steps
$$
X_{\alpha'+1},\,X_{\alpha'(2)+1},\,X_{\alpha'(3)+1},\dots
$$
is i.i.d., distributed like $X_1$. Assuming $X_1$ is nontrivial, these steps have a nonzero probability of being positive, so with probability $1$ infinitely many of them are positive. If $X_{\alpha'(k)+1}$ is positive, then $S_{\alpha'(k)+1}=S_{\alpha'(k)}+X_{\alpha'(k)+1}>0$. Therefore, there are infinitely many times of the form $\alpha'(k)+1$ at which $S_n>0$. The same argument gives infinitely many times at which $S_n<0$.
answered Mar 20 at 17:32 by Mike Earnest · edited Mar 20 at 18:07 by cauthon14