
Distribution of Maximum Likelihood Estimator


Why is the maximum likelihood estimator normally distributed? I can't figure out why it is true for large $n$ in general. My attempt (for a single parameter):

Let $L(\theta)$ be the likelihood function for the distribution $f(x;\theta)$.

Then, after taking a sample of size $n$,

$$L(\theta)=f(x_1;\theta)\cdot f(x_2;\theta)\cdots f(x_n;\theta)$$

and we want to find $\theta_{\max}$ such that $L(\theta)$ is maximized; $\theta_{\max}$ is our estimate (once a sample has actually been selected).

Since $\theta_{\max}$ maximizes $L(\theta)$, it also maximizes $\ln(L(\theta))$, where

$$\ln(L(\theta))=\ln(f(x_1;\theta))+\ln(f(x_2;\theta))+\cdots+\ln(f(x_n;\theta)).$$

Taking the derivative with respect to $\theta$,

$$\frac{f'(x_1;\theta)}{f(x_1;\theta)}+\frac{f'(x_2;\theta)}{f(x_2;\theta)}+\cdots+\frac{f'(x_n;\theta)}{f(x_n;\theta)}.$$

$\theta_{\max}$ would be the solution of the above when set to $0$ (after selecting values for all $x_1,x_2,\ldots,x_n$), but why is it normally distributed, and how do I show that it's true for large $n$?
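As a concrete sketch of this setup (an assumed illustration, not part of the original post): for the Exponential($\theta$) density $f(x;\theta)=\theta e^{-\theta x}$, each score term $f'(x_i;\theta)/f(x_i;\theta)$ is $1/\theta - x_i$, and solving the score equation numerically recovers the closed-form MLE $\hat\theta = 1/\bar x$.

```python
import numpy as np

# Assumed example: Exponential(theta) model, f(x; theta) = theta * exp(-theta * x).
# Each score term is 1/theta - x_i, so the score sum is n/theta - sum(x),
# whose root is the closed form theta_hat = 1 / mean(x).
rng = np.random.default_rng(0)
x = rng.exponential(scale=1 / 2.0, size=1000)  # sample with true theta = 2

def score(theta):
    """Derivative of the log-likelihood: sum of 1/theta - x_i."""
    return (1.0 / theta - x).sum()

# Simple bisection: score(theta) is strictly decreasing on (0, inf).
lo, hi = 1e-6, 100.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if score(mid) > 0:
        lo = mid
    else:
        hi = mid
theta_hat = 0.5 * (lo + hi)  # numerical root; should match 1 / x.mean()
```

The bisection stands in for whatever root-finder one would actually use; the point is that the MLE is defined implicitly as the zero of the score.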










Tags: probability, distributions, normal-distribution, estimation, sampling






asked Mar 15 at 0:50 by Colin Hicks




















          1 Answer
The MLE is characterized by the score
$$\frac{\partial \ln L(\theta)}{\partial \theta} = \sum_{i=1}^n \frac{f'(x_i;\theta)}{f(x_i;\theta)},$$
where $f'(x_i;\theta)$ could denote a gradient (allowing for the multivariate case, but still sticking to your notation). Define a new function $g(x;\theta)=\frac{f'(x;\theta)}{f(x;\theta)}$. Then $\{g(x_i;\theta)\}_{i=1}^n$ is a new iid sequence of random variables, with $E[g(x_1;\theta)]=0$. If $E[g(x_1;\theta)g(x_1;\theta)']<\infty$, the CLT implies
$$\sqrt{n}\,\bigl(\bar{g}_n(\theta)-E[g(x_1;\theta)]\bigr)=\sqrt{n}\,\bar{g}_n(\theta) \rightarrow_D N\bigl(0,\,E[g(x;\theta)g(x;\theta)']\bigr),$$
where $\bar{g}_n(\theta)=\frac{1}{n}\sum_{i=1}^n g(x_i;\theta)$. The ML estimator solves the equation
$$\bar{g}_n(\theta)=0.$$
It follows that the ML estimator is given by
$$\hat{\theta}=\bar{g}_n^{-1}(0).$$
So long as the set of discontinuity points of $\bar{g}_n^{-1}(z)$, i.e. the set of all values of $z$ such that $\bar{g}_n^{-1}(z)$ is not continuous, occurs with probability zero, the continuous mapping theorem gives us asymptotic normality of $\hat{\theta}$.
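This asymptotic claim can be checked by simulation (an assumed illustration, not from the original answer): for the Exponential($\theta$) model the MLE is $\hat\theta = 1/\bar x$ and the Fisher information is $I(\theta)=1/\theta^2$, so $\sqrt{n}\,(\hat\theta-\theta)$ should be approximately $N(0,\theta^2)$ for large $n$.

```python
import numpy as np

# Assumed Monte Carlo sketch of the CLT argument for the Exponential(theta)
# model: replicate the experiment many times, compute the MLE in each
# replication, and check that sqrt(n) * (theta_hat - theta) has a standard
# deviation close to theta (the square root of the inverse Fisher information).
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 1000, 2000

samples = rng.exponential(scale=1 / theta, size=(reps, n))
theta_hat = 1.0 / samples.mean(axis=1)  # MLE in each replication
z = np.sqrt(n) * (theta_hat - theta)    # centred and scaled estimates

# z.mean() should be near 0 and z.std() near theta = 2 for large n.
print(z.mean(), z.std())
```

A histogram of `z` would likewise look close to a $N(0, \theta^2)$ density, which is the visual counterpart of the continuous-mapping step above.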


















• $\frac{f'(x)}{f(x)}$ is just another function of $x$, so the central limit theorem applies. Thank you for that.
  – Colin Hicks, Mar 15 at 1:23











• You're welcome :)
  – dlnB, Mar 15 at 1:26










answered Mar 15 at 1:16 (edited Mar 15 at 2:12) by dlnB










