Optimal estimator of Poisson distribution


As part of my homework, I was asked to solve the following:




A signal and noise are stochastically independent, and it is known that $n\sim \mathrm{Poiss}(\mu_n)$ and $x\sim \mathrm{Poiss}(\mu_x)$. We define $y=x+n$.



  1. Find the optimal estimator (MMSE) of $y$ given $x$.

  2. Find the linear optimal estimator (MMSE) of $x$ given $y$.



I'm not familiar with optimal estimators, and I couldn't find a good source from which to study them.



Any help will be appreciated.







signal-processing online-resources






edited Mar 24 at 15:36 by segevp
asked Mar 21 at 18:44 by segevp
  • Since you mention MMSE, perhaps going through the wiki page might help. – StubbornAtom, Mar 21 at 19:12


























1 Answer



















(1) The MMSE is given by $\widehat{y}=E\left[y \,\middle|\, x\right]$, and the equation $y=x+n$ yields $$E\left[y \,\middle|\, x\right] = E\left[x \,\middle|\, x\right]+E\left[n \,\middle|\, x\right] = x + E[n] = x+\mu_n\,,$$
where the second identity holds in view of the independence of $x$ and $n$.



In other words, $\widehat{y}=x+\mu_n$.



(2) The MMSE is given by $\widehat{x}=E\left[x \,\middle|\, y\right]$. The relation $y=x+n$ yields $E\left[x \,\middle|\, y\right] = y-E\left[n \,\middle|\, y\right]$. As shown below, we have $E\left[n \,\middle|\, y\right] = y\left(\frac{\mu_n}{\mu_n+\mu_x}\right)$.

Therefore, $\widehat{x}=y\left(\frac{\mu_x}{\mu_n+\mu_x}\right)$.




We now show that $E\left[n \,\middle|\, y\right]=y\left(\frac{\mu_n}{\mu_n+\mu_x}\right)$. We have

$$E\left[n \,\middle|\, y=y'\right]=\sum_{k\leq y'}k\,\mathbb{P}\left(n=k \,\middle|\, y=y'\right)\,,$$

and therefore

\begin{align}
\mathbb{P}\left(n=k \,\middle|\, y=y'\right) &= \frac{\mathbb{P}\left(n=k,\, y=y'\right)}{\mathbb{P}\left(y=y'\right)} \\
&= \frac{\mathbb{P}\left(n=k,\, x=y'-k\right)}{\mathbb{P}\left(y=y'\right)} \\
&= \frac{\mathbb{P}\left(n=k\right)\mathbb{P}\left(x=y'-k\right)}{\mathbb{P}\left(y=y'\right)} \\
&= \frac{\left(e^{-\mu_n}\mu_n^{k}/k!\right)\left(e^{-\mu_x}\mu_x^{y'-k}/(y'-k)!\right)}{e^{-(\mu_n+\mu_x)}(\mu_n+\mu_x)^{y'}/y'!} = \binom{y'}{k}\left(\frac{\mu_n}{\mu_n+\mu_x}\right)^{k}\left(\frac{\mu_x}{\mu_n+\mu_x}\right)^{y'-k}\,,
\end{align}

where the second-to-last equality above holds since $y\sim \mathsf{Poisson}(\mu_n+\mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $\left(\mathbb{P}\left(n=k \,\middle|\, y=y'\right)\right)_{k\leq y'}$ is $\mathsf{Binomial}(y',p)$ with $p=\mu_n/(\mu_n+\mu_x)$, and thus

$$E\left[n \,\middle|\, y=y'\right] = p\,y' = y'\left(\frac{\mu_n}{\mu_n+\mu_x}\right)\,.$$

That is, $E\left[n \,\middle|\, y\right]=y\left(\frac{\mu_n}{\mu_n+\mu_x}\right)$.
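Both estimators and the binomial conditional law can be sanity-checked by simulation. The sketch below uses NumPy with arbitrary example values $\mu_x=3$ and $\mu_n=2$ (these numbers are illustrative choices, not part of the original problem):

```python
import numpy as np
from math import comb

# Monte Carlo check of the MMSE estimators derived above.
# mu_x = 3 and mu_n = 2 are arbitrary example values.
rng = np.random.default_rng(0)
mu_x, mu_n, N = 3.0, 2.0, 1_000_000

x = rng.poisson(mu_x, N)   # signal
n = rng.poisson(mu_n, N)   # noise, independent of x
y = x + n                  # observation

# (1) E[y | x] = x + mu_n: check at a fixed value x0.
x0 = 3
est1 = y[x == x0].mean()   # ~ x0 + mu_n = 5.0

# (2) E[x | y] = y * mu_x / (mu_n + mu_x): check at a fixed value y0.
y0 = 5
est2 = x[y == y0].mean()   # ~ y0 * mu_x / (mu_n + mu_x) = 3.0

# Conditional law of n given y = y0 should be Binomial(y0, p).
p = mu_n / (mu_n + mu_x)
emp = np.bincount(n[y == y0], minlength=y0 + 1) / (y == y0).sum()
binom = np.array([comb(y0, k) * p**k * (1 - p)**(y0 - k) for k in range(y0 + 1)])

print(est1, est2, np.abs(emp - binom).max())
```

With a million samples, the two conditional means land within a few thousandths of the predicted values, and the empirical conditional histogram of $n$ given $y=y_0$ matches the binomial pmf bin by bin.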






answered Mar 24 at 20:51 by Augusto S.; edited Mar 25 at 22:29 by Lee David Chung Lin
  • That is a great answer! Thank you so much. If I may, I have another related question (this is where things get tricky!): if $\mu_x=\mu_x(\theta)$, how would you calculate the Fisher information of $x$ on $\theta$, $J_x(\theta)$? – segevp, Mar 24 at 22:49










  • First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have posted a full solution to your question!). Regarding the Fisher information, let us proceed by degrees. You have the expression for the Fisher information given by $\mathcal{J}_x(\theta)=E\left[\left(\frac{\partial}{\partial\theta}\log f(x;\theta)\right)^2 \,\middle|\, \theta\right]$, where $f$ is the distribution of $x$ with parameter $\theta$. You know that $x$ is Poisson with mean $\mu_x(\theta)$. – Augusto S., Mar 24 at 23:39










  • Can you develop the expression above? $x$ is Poisson and hence $f(x;\theta)=e^{-\mu_x(\theta)}\frac{(\mu_x(\theta))^x}{x!}$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $\theta$ (I am assuming that $\mu_x(\theta)$ is differentiable); (iii) use the property of conditional expectations $E\left[Xg(Z) \,\middle|\, Z\right]=g(Z)\,E\left[X \,\middle|\, Z\right]$ to further simplify the expression? See what expression you get -- it will be given in terms of the derivative of $\mu_x(\theta)$ and of $\mu_x(\theta)$ itself. Let me know if you have questions. – Augusto S., Mar 24 at 23:47
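Carrying steps (i)-(iii) above through gives the score $\frac{\partial}{\partial\theta}\log f(x;\theta)=\mu_x'(\theta)\left(\frac{x}{\mu_x(\theta)}-1\right)$ and hence $\mathcal{J}_x(\theta)=\mu_x'(\theta)^2/\mu_x(\theta)$. That closed form can be sanity-checked numerically; the sketch below uses the arbitrary example $\mu_x(\theta)=\theta^2$ at $\theta=1.5$ (both hypothetical choices, not from the thread):

```python
import numpy as np

# Monte Carlo check of the Fisher information of a Poisson observation
# with mean mu_x(theta). Following the steps above, the score is
#   d/dtheta log f(x; theta) = mu_x'(theta) * (x / mu_x(theta) - 1),
# so J_x(theta) = mu_x'(theta)**2 / mu_x(theta).
# mu_x(theta) = theta**2 and theta = 1.5 are arbitrary example choices.
rng = np.random.default_rng(1)
theta = 1.5
mu = theta**2          # mu_x(theta)  = 2.25
dmu = 2 * theta        # mu_x'(theta) = 3.0

x = rng.poisson(mu, 2_000_000)
score = dmu * (x / mu - 1.0)
J_mc = (score**2).mean()

print(J_mc)            # ~ dmu**2 / mu = 4.0
```

The empirical mean of the squared score agrees with the closed form $\mu_x'(\theta)^2/\mu_x(\theta)=9/2.25=4$ to within Monte Carlo error.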











  • While this is true, you did help me a lot with the homework, and I did learn much from it. I tried to solve it myself for days, so I did not have much choice. Thank you again, everything is much clearer now :) – segevp, Mar 25 at 6:38






  • I am glad it helped! – Augusto S., Mar 25 at 7:07












Your Answer





StackExchange.ifUsing("editor", function ()
return StackExchange.using("mathjaxEditing", function ()
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix)
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
);
);
, "mathjax-editing");

StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "69"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);

else
createEditor();

);

function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);



);













draft saved

draft discarded


















StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3157230%2foptimal-estimator-of-poisson-distribution%23new-answer', 'question_page');

);

Post as a guest















Required, but never shown

























1 Answer
1






active

oldest

votes








1 Answer
1






active

oldest

votes









active

oldest

votes






active

oldest

votes









2





+100







$begingroup$

(1) The MMSE is given by $widehaty=Eleft[y,middle|~xright]$ and the equation $y=x+n$ yields $$Eleft[y ,middle|~ x right] = Eleft[x ,middle|~ x right]+Eleft[n ,middle|~ x right] = x + E[n] = x+mu_n~,$$
where the second identity holds in view of the independence of $x$ and $n$.



In other words, $widehaty=x+mu_n$.



(2) The MMSE is given by $widehatx=Eleft[ x ,middle|~ y right]$. The relation $y=x+n$ yields $Eleft[x ,middle|~ y right] = y-Eleft[n ,middle|~ y right]$. As shown below, we have $Eleft[n ,middle|~ y right] = yleft(fracmu_nmu_n+mu_xright)$.



Therefore, $widehatx=yleft(fracmu_xmu_n+mu_xright)$.




We now show that $Eleft[n ,middle|~ y right]=yleft(fracmu_nmu_n+mu_xright)$. We have



$$Eleft[n ,middle|~ y=y' right]=sum_kleq y'kmathbbPleft(n=kleft|y=y'right.right)~,$$



and therefore



beginalign
mathbbPleft(n=k ,middle|~
y=y' right) & =fracmathbbPleft(n=k,,y=y'right)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=k,,x=y'-kright)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=kright)mathbbPleft(x=y'-kright)mathbbPleft(y=y'right) \
&= fracleft(e^-mu_nmu_n^k/k!right)left(e^-mu_xmu_x^y'-k/(y'-k)!right)left(e^-left(mu_n+mu_xright)left(mu_n+mu_xright)^y'/y'!right)=left(beginarrayc y' \ kendarrayright) left(fracmu_nmu_n+mu_xright)^kleft(fracmu_xmu_n+mu_xright)^y'-k~,
endalign



where the second to last equality above holds since $ysim sf Poisson(mu_n+mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $left(mathbbPleft(n=k ,middle|~y=y' right)right)_kleq y'$ is $sf Binomial(y',p)$ with $p= mu_n ,/(mu_n+mu_x)$ and thus



$$Eleft[n ,middle| y=y' right] = py' = y'left(fracmu_nmu_n+mu_xright)$$



That is, $Eleft[n ,middle| y right]=yleft(fracmu_nmu_n+mu_xright)$.






share|cite|improve this answer











$endgroup$








  • 1




    $begingroup$
    That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
    $endgroup$
    – segevp
    Mar 24 at 22:49










  • $begingroup$
    First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:39










  • $begingroup$
    Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:47











  • $begingroup$
    while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
    $endgroup$
    – segevp
    Mar 25 at 6:38






  • 1




    $begingroup$
    I am glad it helped!
    $endgroup$
    – Augusto S.
    Mar 25 at 7:07
















2





+100







$begingroup$

(1) The MMSE is given by $widehaty=Eleft[y,middle|~xright]$ and the equation $y=x+n$ yields $$Eleft[y ,middle|~ x right] = Eleft[x ,middle|~ x right]+Eleft[n ,middle|~ x right] = x + E[n] = x+mu_n~,$$
where the second identity holds in view of the independence of $x$ and $n$.



In other words, $widehaty=x+mu_n$.



(2) The MMSE is given by $widehatx=Eleft[ x ,middle|~ y right]$. The relation $y=x+n$ yields $Eleft[x ,middle|~ y right] = y-Eleft[n ,middle|~ y right]$. As shown below, we have $Eleft[n ,middle|~ y right] = yleft(fracmu_nmu_n+mu_xright)$.



Therefore, $widehatx=yleft(fracmu_xmu_n+mu_xright)$.




We now show that $Eleft[n ,middle|~ y right]=yleft(fracmu_nmu_n+mu_xright)$. We have



$$Eleft[n ,middle|~ y=y' right]=sum_kleq y'kmathbbPleft(n=kleft|y=y'right.right)~,$$



and therefore



beginalign
mathbbPleft(n=k ,middle|~
y=y' right) & =fracmathbbPleft(n=k,,y=y'right)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=k,,x=y'-kright)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=kright)mathbbPleft(x=y'-kright)mathbbPleft(y=y'right) \
&= fracleft(e^-mu_nmu_n^k/k!right)left(e^-mu_xmu_x^y'-k/(y'-k)!right)left(e^-left(mu_n+mu_xright)left(mu_n+mu_xright)^y'/y'!right)=left(beginarrayc y' \ kendarrayright) left(fracmu_nmu_n+mu_xright)^kleft(fracmu_xmu_n+mu_xright)^y'-k~,
endalign



where the second to last equality above holds since $ysim sf Poisson(mu_n+mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $left(mathbbPleft(n=k ,middle|~y=y' right)right)_kleq y'$ is $sf Binomial(y',p)$ with $p= mu_n ,/(mu_n+mu_x)$ and thus



$$Eleft[n ,middle| y=y' right] = py' = y'left(fracmu_nmu_n+mu_xright)$$



That is, $Eleft[n ,middle| y right]=yleft(fracmu_nmu_n+mu_xright)$.






share|cite|improve this answer











$endgroup$








  • 1




    $begingroup$
    That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
    $endgroup$
    – segevp
    Mar 24 at 22:49










  • $begingroup$
    First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:39










  • $begingroup$
    Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:47











  • $begingroup$
    while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
    $endgroup$
    – segevp
    Mar 25 at 6:38






  • 1




    $begingroup$
    I am glad it helped!
    $endgroup$
    – Augusto S.
    Mar 25 at 7:07














2





+100







2





+100



2




+100



$begingroup$

(1) The MMSE is given by $widehaty=Eleft[y,middle|~xright]$ and the equation $y=x+n$ yields $$Eleft[y ,middle|~ x right] = Eleft[x ,middle|~ x right]+Eleft[n ,middle|~ x right] = x + E[n] = x+mu_n~,$$
where the second identity holds in view of the independence of $x$ and $n$.



In other words, $widehaty=x+mu_n$.



(2) The MMSE is given by $widehatx=Eleft[ x ,middle|~ y right]$. The relation $y=x+n$ yields $Eleft[x ,middle|~ y right] = y-Eleft[n ,middle|~ y right]$. As shown below, we have $Eleft[n ,middle|~ y right] = yleft(fracmu_nmu_n+mu_xright)$.



Therefore, $widehatx=yleft(fracmu_xmu_n+mu_xright)$.




We now show that $Eleft[n ,middle|~ y right]=yleft(fracmu_nmu_n+mu_xright)$. We have



$$Eleft[n ,middle|~ y=y' right]=sum_kleq y'kmathbbPleft(n=kleft|y=y'right.right)~,$$



and therefore



beginalign
mathbbPleft(n=k ,middle|~
y=y' right) & =fracmathbbPleft(n=k,,y=y'right)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=k,,x=y'-kright)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=kright)mathbbPleft(x=y'-kright)mathbbPleft(y=y'right) \
&= fracleft(e^-mu_nmu_n^k/k!right)left(e^-mu_xmu_x^y'-k/(y'-k)!right)left(e^-left(mu_n+mu_xright)left(mu_n+mu_xright)^y'/y'!right)=left(beginarrayc y' \ kendarrayright) left(fracmu_nmu_n+mu_xright)^kleft(fracmu_xmu_n+mu_xright)^y'-k~,
endalign



where the second to last equality above holds since $ysim sf Poisson(mu_n+mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $left(mathbbPleft(n=k ,middle|~y=y' right)right)_kleq y'$ is $sf Binomial(y',p)$ with $p= mu_n ,/(mu_n+mu_x)$ and thus



$$Eleft[n ,middle| y=y' right] = py' = y'left(fracmu_nmu_n+mu_xright)$$



That is, $Eleft[n ,middle| y right]=yleft(fracmu_nmu_n+mu_xright)$.






share|cite|improve this answer











$endgroup$



(1) The MMSE is given by $widehaty=Eleft[y,middle|~xright]$ and the equation $y=x+n$ yields $$Eleft[y ,middle|~ x right] = Eleft[x ,middle|~ x right]+Eleft[n ,middle|~ x right] = x + E[n] = x+mu_n~,$$
where the second identity holds in view of the independence of $x$ and $n$.



In other words, $widehaty=x+mu_n$.



(2) The MMSE is given by $widehatx=Eleft[ x ,middle|~ y right]$. The relation $y=x+n$ yields $Eleft[x ,middle|~ y right] = y-Eleft[n ,middle|~ y right]$. As shown below, we have $Eleft[n ,middle|~ y right] = yleft(fracmu_nmu_n+mu_xright)$.



Therefore, $widehatx=yleft(fracmu_xmu_n+mu_xright)$.




We now show that $Eleft[n ,middle|~ y right]=yleft(fracmu_nmu_n+mu_xright)$. We have



$$Eleft[n ,middle|~ y=y' right]=sum_kleq y'kmathbbPleft(n=kleft|y=y'right.right)~,$$



and therefore



beginalign
mathbbPleft(n=k ,middle|~
y=y' right) & =fracmathbbPleft(n=k,,y=y'right)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=k,,x=y'-kright)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=kright)mathbbPleft(x=y'-kright)mathbbPleft(y=y'right) \
&= fracleft(e^-mu_nmu_n^k/k!right)left(e^-mu_xmu_x^y'-k/(y'-k)!right)left(e^-left(mu_n+mu_xright)left(mu_n+mu_xright)^y'/y'!right)=left(beginarrayc y' \ kendarrayright) left(fracmu_nmu_n+mu_xright)^kleft(fracmu_xmu_n+mu_xright)^y'-k~,
endalign



where the second to last equality above holds since $ysim sf Poisson(mu_n+mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $left(mathbbPleft(n=k ,middle|~y=y' right)right)_kleq y'$ is $sf Binomial(y',p)$ with $p= mu_n ,/(mu_n+mu_x)$ and thus



$$Eleft[n ,middle| y=y' right] = py' = y'left(fracmu_nmu_n+mu_xright)$$



That is, $Eleft[n ,middle| y right]=yleft(fracmu_nmu_n+mu_xright)$.







share|cite|improve this answer














share|cite|improve this answer



share|cite|improve this answer








edited Mar 25 at 22:29









Lee David Chung Lin

4,47841242




4,47841242










answered Mar 24 at 20:51









Augusto S.Augusto S.

69649




69649







  • 1




    $begingroup$
    That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
    $endgroup$
    – segevp
    Mar 24 at 22:49










  • $begingroup$
    First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:39










  • $begingroup$
    Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:47











  • $begingroup$
    while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
    $endgroup$
    – segevp
    Mar 25 at 6:38






  • 1




    $begingroup$
    I am glad it helped!
    $endgroup$
    – Augusto S.
    Mar 25 at 7:07













  • 1




    $begingroup$
    That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
    $endgroup$
    – segevp
    Mar 24 at 22:49










  • $begingroup$
    First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:39










  • $begingroup$
    Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:47











  • $begingroup$
    while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
    $endgroup$
    – segevp
    Mar 25 at 6:38






  • 1




    $begingroup$
    I am glad it helped!
    $endgroup$
    – Augusto S.
    Mar 25 at 7:07








1




1




$begingroup$
That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
$endgroup$
– segevp
Mar 24 at 22:49




$begingroup$
That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
$endgroup$
– segevp
Mar 24 at 22:49












$begingroup$
First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
$endgroup$
– Augusto S.
Mar 24 at 23:39




$begingroup$
First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
$endgroup$
– Augusto S.
Mar 24 at 23:39












$begingroup$
Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
$endgroup$
– Augusto S.
Mar 24 at 23:47





$begingroup$
Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
$endgroup$
– Augusto S.
Mar 24 at 23:47













$begingroup$
while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
$endgroup$
– segevp
Mar 25 at 6:38




$begingroup$
while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
$endgroup$
– segevp
Mar 25 at 6:38




1




1




$begingroup$
I am glad it helped!
$endgroup$
– Augusto S.
Mar 25 at 7:07





$begingroup$
I am glad it helped!
$endgroup$
– Augusto S.
Mar 25 at 7:07


















draft saved

draft discarded















































