


How can the negative binomial distribution be derived from another more “elementary” distribution?




I am looking at the negative binomial distribution for the case where $p$ corresponds to "success probability" and $r$ is the integer number of "failures". In this case we have $$E(X)=\frac{rp}{1-p}$$ $$\text{Var}(X)=\frac{rp}{(1-p)^2}$$



I thought it might be derived from the geometric distribution, but isn't the geometric distribution itself derived from the negative binomial distribution, rather than the other way round?



Please explain; a source I can cite for this would also be great.


























      probability probability-theory probability-distributions negative-binomial






asked Dec 24 '16 at 15:34 by Hiboa4 (edited Dec 24 '16 at 15:40 by Theoretical Economist)



          2 Answers


















You look at a sequence of independent Bernoulli trials $B_i$ where $B_i\sim \mathrm{Bernoulli}(p)$, so $B_i=1$ denotes success (probability $p$) and $B_i=0$ denotes failure (probability $1-p$), matching the convention in the question.

Let $N$ be the number of successes observed before the first failure. So if $N=k$ (with $k\geq 0$), then you have $k$ successes before the first failure occurs at the $(k+1)$-th trial; in other words, you observe $(B_1=1,B_2=1,\dots,B_k=1,B_{k+1}=0)$, which has probability $p^k(1-p)$. Then $N$ has a geometric distribution.

Now let $N_r$ be the number of successes you observe before the $r$-th failure. You have to observe a random number of successes before the first failure, then a random number of successes before the second failure, and so on, up to a random number of successes before the $r$-th failure. Each of these counts has the same geometric distribution as $N$, and they are independent. So $N_r$ is the sum of $r$ independent geometric random variables, and this $N_r$ has the negative binomial distribution.

So it is the other way round: the geometric distribution gives rise to the negative binomial distribution.

The expectation of the negative binomial distribution is just the sum of the expectations of $r$ such geometric random variables. Each has expectation $\dfrac{p}{1-p}$, so the negative binomial has expectation $\dfrac{rp}{1-p}$.

Since the geometric random variables are independent, the variance of the negative binomial is the sum of the variances of $r$ such geometric random variables. Each has variance $\dfrac{p}{(1-p)^2}$, so the negative binomial has variance $\dfrac{rp}{(1-p)^2}$.






answered Dec 24 '16 at 16:58 by Landon Carter
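As a quick sanity check of this construction, here is a minimal Python simulation sketch: it draws the negative binomial as a sum of $r$ independent geometric counts (successes of probability $p$ before a failure) and compares the sample mean and variance with $rp/(1-p)$ and $rp/(1-p)^2$. The parameter values and function names below are illustrative choices, not anything prescribed by the answer.

```python
import random

def geometric_successes(p: float) -> int:
    """Number of successes (each with probability p) before the first failure."""
    count = 0
    while random.random() < p:  # success with probability p
        count += 1
    return count

def negative_binomial(r: int, p: float) -> int:
    """Number of successes before the r-th failure: a sum of r geometric counts."""
    return sum(geometric_successes(p) for _ in range(r))

p, r, n_sims = 0.3, 5, 200_000
draws = [negative_binomial(r, p) for _ in range(n_sims)]
mean = sum(draws) / n_sims
var = sum((x - mean) ** 2 for x in draws) / (n_sims - 1)
print(f"sample mean {mean:.4f}   theory {r * p / (1 - p):.4f}")
print(f"sample var  {var:.4f}   theory {r * p / (1 - p) ** 2:.4f}")
```

With these (arbitrary) values the sample moments come out close to $2.143$ and $3.061$, the theoretical mean and variance for $r=5$, $p=0.3$.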
The (0-based) geometric distribution is that of the count of failures before the first success in an indefinite sequence of independent Bernoulli trials with identical success rate.

A negative binomial distribution is that of the count of successes before a specified number of failures occurs in an indefinite sequence of independent Bernoulli trials with identical success rate.

These definitions are clearly interrelated. You can derive one from the other, or both together from first principles.

It all depends on which one you are given as a starting point.




Let $X_i$ be a geometric random variable with success rate $1-p$. Then, by applying the definition above, it is apparent that $X_i$ also has a negative binomial distribution: it is the count of 'successes' before $1$ 'failure', with 'failure' rate $1-p$.

$$X_i\sim\mathcal{Geo}_0(1-p) \iff X_i\sim\mathcal{NegBin}(1, p)$$

So if you are given the probability mass function, expectation, and variance for a general negative binomial, you can immediately find the probability mass function, expectation, and variance for a geometric random variable.




Let $Y_r$ be a negative binomial random variable with success rate $p$ and a specified number of failures $r$. Then $Y_r$ is the sum of $r$ independent geometric random variables with identical success rate $1-p$. (Can you see why?)

$$Y_r\sim\mathcal{NegBin}(r, p)~\iff~ Y_r=\sum_{i=1}^r X_i~\wedge~ \bigl(X_i\bigr)_{i=1}^r\overset{\text{iid}}{\sim}\mathcal{Geo}_0(1-p)$$

So if you have been given the pmf for a geometric distribution, you can obtain the general pmf, expectation, and variance of a negative binomial distribution with just a little work.




So if you start with $\mathsf E(X_1)=p(1-p)^{-1}$ and $\mathsf{Var}(X_1)=p(1-p)^{-2}$, because $X_1\sim\mathcal{Geo}_0(1-p)$, then

$$\begin{align}\mathsf E(Y_r) ~&=~ \sum_{i=1}^r\mathsf E(X_i) \\ &=~ r\,\mathsf E(X_1) \\ &=~ rp(1-p)^{-1}\\[2ex]\mathsf{Var}(Y_r) ~&=~ \sum_{i=1}^r\mathsf{Var}(X_i)+2\sum_{1\leq i<j\leq r}\mathsf{Cov}(X_i,X_j)\\ &=~ r\,\mathsf{Var}(X_1) \\ &=~ rp(1-p)^{-2}\end{align}$$

where the covariance terms vanish because the $X_i$ are independent.







answered Dec 24 '16 at 17:24 by Graham Kemp (edited Dec 24 '16 at 17:36)
• Thanks for this explanation, it makes a lot of sense. I am going to use part of it as an explanation in a uni report. Is citing this answer in this forum an acceptable reference, or do I need to use a book/journal? If so, do you have a suggestion of where I can find a similar explanation in the literature? – Hiboa4, Dec 24 '16 at 22:22
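To make the "little work" concrete, here is a minimal Python sketch under the same convention (the truncation point `k_max` and the values of $p$ and $r$ are arbitrary choices for illustration): it builds the negative binomial pmf by convolving the geometric pmf $p^k(1-p)$ with itself $r$ times and compares the result with the closed form $\binom{k+r-1}{k}p^k(1-p)^r$.

```python
from math import comb

def geo_pmf(k: int, p: float) -> float:
    """P(k successes of probability p before the first failure)."""
    return p ** k * (1 - p)

def convolve(f, g):
    """Convolution of two pmfs given as lists indexed by k."""
    out = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] += fi * gj
    return out

p, r, k_max = 0.4, 3, 40
geo = [geo_pmf(k, p) for k in range(k_max + 1)]

nb = [1.0]                  # pmf of an empty sum: point mass at 0
for _ in range(r):
    nb = convolve(nb, geo)  # add one more independent geometric term

for k in range(10):
    closed_form = comb(k + r - 1, k) * p ** k * (1 - p) ** r
    print(k, round(nb[k], 6), round(closed_form, 6))
```

The two columns agree (exactly, for $k$ well below `k_max`), which is just the statement that $Y_r=\sum_{i=1}^r X_i$ for iid $\mathcal{Geo}_0(1-p)$ terms.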









