Optimal estimator of Poisson distribution


As part of my homework, I was asked to solve the following:

  A signal and noise are stochastically independent, and it is known that $n\sim\mathrm{Poiss}(\mu_n)$ and $x\sim\mathrm{Poiss}(\mu_x)$. We define $y=x+n$.

  1. Find the optimal (MMSE) estimator of $y$ given $x$.
  2. Find the optimal linear (MMSE) estimator of $x$ given $y$.

I'm not familiar with optimal estimators, and I couldn't find a good source to study them.

Any help will be appreciated.

signal-processing online-resources

asked Mar 21 at 18:44, edited Mar 24 at 15:36 – segevp

  • Since you mention MMSE, perhaps going through the wiki page might help.
    – StubbornAtom, Mar 21 at 19:12

1 Answer

(1) The MMSE estimator is given by $\widehat{y}=E\left[y\mid x\right]$, and the equation $y=x+n$ yields
$$E\left[y\mid x\right] = E\left[x\mid x\right]+E\left[n\mid x\right] = x + E[n] = x+\mu_n\,,$$
where the second identity holds in view of the independence of $x$ and $n$.

In other words, $\widehat{y}=x+\mu_n$.



(2) The MMSE estimator is given by $\widehat{x}=E\left[x\mid y\right]$. The relation $y=x+n$ yields $E\left[x\mid y\right] = y-E\left[n\mid y\right]$. As shown below, we have $E\left[n\mid y\right] = y\left(\frac{\mu_n}{\mu_n+\mu_x}\right)$.

Therefore, $\widehat{x}=y\left(\frac{\mu_x}{\mu_n+\mu_x}\right)$; since this conditional mean is linear in $y$, it is also the optimal linear (LMMSE) estimator that was asked for.
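Both closed forms are easy to sanity-check by simulation before reading the proof below. A minimal NumPy sketch (the values $\mu_x=3$, $\mu_n=2$, the sample size, and the conditioning points are arbitrary illustrative choices, not part of the problem):

    import numpy as np

    # Arbitrary illustrative parameters and seed (not from the problem statement).
    mu_x, mu_n = 3.0, 2.0
    rng = np.random.default_rng(0)
    N = 1_000_000

    x = rng.poisson(mu_x, size=N)  # signal
    n = rng.poisson(mu_n, size=N)  # independent noise
    y = x + n

    # (1) E[y | x] = x + mu_n: compare at the fixed value x = 3.
    print(y[x == 3].mean(), 3 + mu_n)                  # both close to 5.0

    # (2) E[x | y] = y * mu_x / (mu_x + mu_n): compare at the fixed value y = 4.
    print(x[y == 4].mean(), 4 * mu_x / (mu_x + mu_n))  # both close to 2.4

Both printed pairs should agree to within Monte Carlo error.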




We now show that $E\left[n\mid y\right]=y\left(\frac{\mu_n}{\mu_n+\mu_x}\right)$. We have

$$E\left[n\mid y=y'\right]=\sum_{k\leq y'}k\,\mathbb{P}\left(n=k\mid y=y'\right)\,,$$

and therefore

\begin{align}
\mathbb{P}\left(n=k\mid y=y'\right) &= \frac{\mathbb{P}\left(n=k,\,y=y'\right)}{\mathbb{P}\left(y=y'\right)} \\
&= \frac{\mathbb{P}\left(n=k,\,x=y'-k\right)}{\mathbb{P}\left(y=y'\right)} \\
&= \frac{\mathbb{P}\left(n=k\right)\mathbb{P}\left(x=y'-k\right)}{\mathbb{P}\left(y=y'\right)} \\
&= \frac{\left(e^{-\mu_n}\mu_n^{k}/k!\right)\left(e^{-\mu_x}\mu_x^{y'-k}/(y'-k)!\right)}{e^{-(\mu_n+\mu_x)}(\mu_n+\mu_x)^{y'}/y'!} = \binom{y'}{k}\left(\frac{\mu_n}{\mu_n+\mu_x}\right)^{k}\left(\frac{\mu_x}{\mu_n+\mu_x}\right)^{y'-k}\,,
\end{align}

where the second-to-last equality holds since $y\sim\mathsf{Poisson}(\mu_n+\mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $\left(\mathbb{P}\left(n=k\mid y=y'\right)\right)_{k\leq y'}$ is $\mathsf{Binomial}(y',p)$ with $p=\mu_n/(\mu_n+\mu_x)$, and thus

$$E\left[n\mid y=y'\right] = py' = y'\left(\frac{\mu_n}{\mu_n+\mu_x}\right)\,.$$

That is, $E\left[n\mid y\right]=y\left(\frac{\mu_n}{\mu_n+\mu_x}\right)$.
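The binomial conditional law just derived (a form of Poisson thinning) can be checked empirically as well; a minimal sketch in the same spirit, again with arbitrary illustrative parameters:

    import numpy as np
    from scipy.stats import binom

    mu_x, mu_n = 3.0, 2.0
    rng = np.random.default_rng(1)
    N = 1_000_000

    x = rng.poisson(mu_x, size=N)
    n = rng.poisson(mu_n, size=N)
    y = x + n

    # Conditional on y = 5, n should be Binomial(5, p) with p = mu_n / (mu_n + mu_x).
    y0, p = 5, mu_n / (mu_n + mu_x)
    mask = y == y0
    empirical = np.bincount(n[mask], minlength=y0 + 1) / mask.sum()
    print(np.round(empirical, 3))                            # empirical conditional pmf of n given y = 5
    print(np.round(binom.pmf(np.arange(y0 + 1), y0, p), 3))  # Binomial(5, 0.4) pmf

The two printed rows should match up to sampling noise.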






answered Mar 24 at 20:51 – Augusto S. (edited Mar 25 at 22:29 – Lee David Chung Lin)
  • That is a great answer! Thank you so much. If I may, I have another related question (this is where things get tricky!): if $\mu_x=\mu_x(\theta)$, how would you calculate the Fisher information of $x$ on $\theta$, $J_x(\theta)$?
    – segevp, Mar 24 at 22:49

  • First, let me say that on reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have posted a full solution!). Regarding the Fisher information, let us proceed by degrees. You have the expression $\mathcal{J}_x(\theta)=E\left[\left(\frac{\partial}{\partial\theta}\log f(x;\theta)\right)^2\,\middle|\,\theta\right]$, where $f$ is the distribution of $x$ with parameter $\theta$. You know that $x$ is Poisson with mean $\mu_x(\theta)$.
    – Augusto S., Mar 24 at 23:39

  • Can you develop the expression above? $x$ is Poisson, and hence $f(x;\theta)=e^{-\mu_x(\theta)}\frac{(\mu_x(\theta))^x}{x!}$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $\theta$ (I am assuming that $\mu_x(\theta)$ is differentiable); (iii) use the property of conditional expectations $E\left[X g(Z)\mid Z\right]=g(Z)\,E\left[X\mid Z\right]$ to simplify further? The expression you get will be in terms of $\mu_x(\theta)$ and its derivative. Let me know if you have questions.
    – Augusto S., Mar 24 at 23:47

  • While this is true, you did help me a lot with the homework, and I did learn much from it. I tried to solve it myself for days, so I did not have much choice. Thank you again; everything is much clearer now :)
    – segevp, Mar 25 at 6:38

  • I am glad it helped!
    – Augusto S., Mar 25 at 7:07
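Carrying out the three steps sketched in these comments gives $\log f(x;\theta)=-\mu_x(\theta)+x\log\mu_x(\theta)-\log x!$, whose $\theta$-derivative is $\mu_x'(\theta)\left(\frac{x}{\mu_x(\theta)}-1\right)$; squaring and taking expectations (using $\operatorname{Var}(x)=\mu_x(\theta)$) yields $J_x(\theta)=\frac{\left(\mu_x'(\theta)\right)^2}{\mu_x(\theta)}$. A short SymPy sketch confirming this for the hypothetical choice $\mu_x(\theta)=\theta^2$ (any positive, differentiable mean function would do):

    import sympy as sp
    from sympy.stats import Poisson, E

    theta = sp.symbols('theta', positive=True)
    k = sp.symbols('k', nonnegative=True)

    mu = theta**2                                             # hypothetical mean function mu_x(theta)
    loglik = -mu + k * sp.log(mu) - sp.log(sp.factorial(k))   # log of the Poisson pmf
    score = sp.diff(loglik, theta)                            # d/dtheta of the log-likelihood

    X = Poisson('X', mu)                                      # x ~ Poisson(mu_x(theta))
    J = sp.simplify(E(score.subs(k, X)**2))                   # Fisher information E[score^2]

    print(J)                                                  # 4
    print(sp.simplify(sp.diff(mu, theta)**2 / mu))            # (mu')^2 / mu = 4, matching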












Your Answer





StackExchange.ifUsing("editor", function ()
return StackExchange.using("mathjaxEditing", function ()
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix)
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
);
);
, "mathjax-editing");

StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "69"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);

else
createEditor();

);

function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);



);













draft saved

draft discarded


















StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3157230%2foptimal-estimator-of-poisson-distribution%23new-answer', 'question_page');

);

Post as a guest















Required, but never shown

























1 Answer
1






active

oldest

votes








1 Answer
1






active

oldest

votes









active

oldest

votes






active

oldest

votes









2





+100







$begingroup$

(1) The MMSE is given by $widehaty=Eleft[y,middle|~xright]$ and the equation $y=x+n$ yields $$Eleft[y ,middle|~ x right] = Eleft[x ,middle|~ x right]+Eleft[n ,middle|~ x right] = x + E[n] = x+mu_n~,$$
where the second identity holds in view of the independence of $x$ and $n$.



In other words, $widehaty=x+mu_n$.



(2) The MMSE is given by $widehatx=Eleft[ x ,middle|~ y right]$. The relation $y=x+n$ yields $Eleft[x ,middle|~ y right] = y-Eleft[n ,middle|~ y right]$. As shown below, we have $Eleft[n ,middle|~ y right] = yleft(fracmu_nmu_n+mu_xright)$.



Therefore, $widehatx=yleft(fracmu_xmu_n+mu_xright)$.




We now show that $Eleft[n ,middle|~ y right]=yleft(fracmu_nmu_n+mu_xright)$. We have



$$Eleft[n ,middle|~ y=y' right]=sum_kleq y'kmathbbPleft(n=kleft|y=y'right.right)~,$$



and therefore



beginalign
mathbbPleft(n=k ,middle|~
y=y' right) & =fracmathbbPleft(n=k,,y=y'right)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=k,,x=y'-kright)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=kright)mathbbPleft(x=y'-kright)mathbbPleft(y=y'right) \
&= fracleft(e^-mu_nmu_n^k/k!right)left(e^-mu_xmu_x^y'-k/(y'-k)!right)left(e^-left(mu_n+mu_xright)left(mu_n+mu_xright)^y'/y'!right)=left(beginarrayc y' \ kendarrayright) left(fracmu_nmu_n+mu_xright)^kleft(fracmu_xmu_n+mu_xright)^y'-k~,
endalign



where the second to last equality above holds since $ysim sf Poisson(mu_n+mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $left(mathbbPleft(n=k ,middle|~y=y' right)right)_kleq y'$ is $sf Binomial(y',p)$ with $p= mu_n ,/(mu_n+mu_x)$ and thus



$$Eleft[n ,middle| y=y' right] = py' = y'left(fracmu_nmu_n+mu_xright)$$



That is, $Eleft[n ,middle| y right]=yleft(fracmu_nmu_n+mu_xright)$.






share|cite|improve this answer











$endgroup$








  • 1




    $begingroup$
    That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
    $endgroup$
    – segevp
    Mar 24 at 22:49










  • $begingroup$
    First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:39










  • $begingroup$
    Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:47











  • $begingroup$
    while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
    $endgroup$
    – segevp
    Mar 25 at 6:38






  • 1




    $begingroup$
    I am glad it helped!
    $endgroup$
    – Augusto S.
    Mar 25 at 7:07
















2





+100







$begingroup$

(1) The MMSE is given by $widehaty=Eleft[y,middle|~xright]$ and the equation $y=x+n$ yields $$Eleft[y ,middle|~ x right] = Eleft[x ,middle|~ x right]+Eleft[n ,middle|~ x right] = x + E[n] = x+mu_n~,$$
where the second identity holds in view of the independence of $x$ and $n$.



In other words, $widehaty=x+mu_n$.



(2) The MMSE is given by $widehatx=Eleft[ x ,middle|~ y right]$. The relation $y=x+n$ yields $Eleft[x ,middle|~ y right] = y-Eleft[n ,middle|~ y right]$. As shown below, we have $Eleft[n ,middle|~ y right] = yleft(fracmu_nmu_n+mu_xright)$.



Therefore, $widehatx=yleft(fracmu_xmu_n+mu_xright)$.




We now show that $Eleft[n ,middle|~ y right]=yleft(fracmu_nmu_n+mu_xright)$. We have



$$Eleft[n ,middle|~ y=y' right]=sum_kleq y'kmathbbPleft(n=kleft|y=y'right.right)~,$$



and therefore



beginalign
mathbbPleft(n=k ,middle|~
y=y' right) & =fracmathbbPleft(n=k,,y=y'right)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=k,,x=y'-kright)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=kright)mathbbPleft(x=y'-kright)mathbbPleft(y=y'right) \
&= fracleft(e^-mu_nmu_n^k/k!right)left(e^-mu_xmu_x^y'-k/(y'-k)!right)left(e^-left(mu_n+mu_xright)left(mu_n+mu_xright)^y'/y'!right)=left(beginarrayc y' \ kendarrayright) left(fracmu_nmu_n+mu_xright)^kleft(fracmu_xmu_n+mu_xright)^y'-k~,
endalign



where the second to last equality above holds since $ysim sf Poisson(mu_n+mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $left(mathbbPleft(n=k ,middle|~y=y' right)right)_kleq y'$ is $sf Binomial(y',p)$ with $p= mu_n ,/(mu_n+mu_x)$ and thus



$$Eleft[n ,middle| y=y' right] = py' = y'left(fracmu_nmu_n+mu_xright)$$



That is, $Eleft[n ,middle| y right]=yleft(fracmu_nmu_n+mu_xright)$.






share|cite|improve this answer











$endgroup$








  • 1




    $begingroup$
    That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
    $endgroup$
    – segevp
    Mar 24 at 22:49










  • $begingroup$
    First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:39










  • $begingroup$
    Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:47











  • $begingroup$
    while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
    $endgroup$
    – segevp
    Mar 25 at 6:38






  • 1




    $begingroup$
    I am glad it helped!
    $endgroup$
    – Augusto S.
    Mar 25 at 7:07














2





+100







2





+100



2




+100



$begingroup$

(1) The MMSE is given by $widehaty=Eleft[y,middle|~xright]$ and the equation $y=x+n$ yields $$Eleft[y ,middle|~ x right] = Eleft[x ,middle|~ x right]+Eleft[n ,middle|~ x right] = x + E[n] = x+mu_n~,$$
where the second identity holds in view of the independence of $x$ and $n$.



In other words, $widehaty=x+mu_n$.



(2) The MMSE is given by $widehatx=Eleft[ x ,middle|~ y right]$. The relation $y=x+n$ yields $Eleft[x ,middle|~ y right] = y-Eleft[n ,middle|~ y right]$. As shown below, we have $Eleft[n ,middle|~ y right] = yleft(fracmu_nmu_n+mu_xright)$.



Therefore, $widehatx=yleft(fracmu_xmu_n+mu_xright)$.




We now show that $Eleft[n ,middle|~ y right]=yleft(fracmu_nmu_n+mu_xright)$. We have



$$Eleft[n ,middle|~ y=y' right]=sum_kleq y'kmathbbPleft(n=kleft|y=y'right.right)~,$$



and therefore



beginalign
mathbbPleft(n=k ,middle|~
y=y' right) & =fracmathbbPleft(n=k,,y=y'right)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=k,,x=y'-kright)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=kright)mathbbPleft(x=y'-kright)mathbbPleft(y=y'right) \
&= fracleft(e^-mu_nmu_n^k/k!right)left(e^-mu_xmu_x^y'-k/(y'-k)!right)left(e^-left(mu_n+mu_xright)left(mu_n+mu_xright)^y'/y'!right)=left(beginarrayc y' \ kendarrayright) left(fracmu_nmu_n+mu_xright)^kleft(fracmu_xmu_n+mu_xright)^y'-k~,
endalign



where the second to last equality above holds since $ysim sf Poisson(mu_n+mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $left(mathbbPleft(n=k ,middle|~y=y' right)right)_kleq y'$ is $sf Binomial(y',p)$ with $p= mu_n ,/(mu_n+mu_x)$ and thus



$$Eleft[n ,middle| y=y' right] = py' = y'left(fracmu_nmu_n+mu_xright)$$



That is, $Eleft[n ,middle| y right]=yleft(fracmu_nmu_n+mu_xright)$.






share|cite|improve this answer











$endgroup$



(1) The MMSE is given by $widehaty=Eleft[y,middle|~xright]$ and the equation $y=x+n$ yields $$Eleft[y ,middle|~ x right] = Eleft[x ,middle|~ x right]+Eleft[n ,middle|~ x right] = x + E[n] = x+mu_n~,$$
where the second identity holds in view of the independence of $x$ and $n$.



In other words, $widehaty=x+mu_n$.



(2) The MMSE is given by $widehatx=Eleft[ x ,middle|~ y right]$. The relation $y=x+n$ yields $Eleft[x ,middle|~ y right] = y-Eleft[n ,middle|~ y right]$. As shown below, we have $Eleft[n ,middle|~ y right] = yleft(fracmu_nmu_n+mu_xright)$.



Therefore, $widehatx=yleft(fracmu_xmu_n+mu_xright)$.




We now show that $Eleft[n ,middle|~ y right]=yleft(fracmu_nmu_n+mu_xright)$. We have



$$Eleft[n ,middle|~ y=y' right]=sum_kleq y'kmathbbPleft(n=kleft|y=y'right.right)~,$$



and therefore



beginalign
mathbbPleft(n=k ,middle|~
y=y' right) & =fracmathbbPleft(n=k,,y=y'right)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=k,,x=y'-kright)mathbbPleft(y=y'right) \
&=fracmathbbPleft(n=kright)mathbbPleft(x=y'-kright)mathbbPleft(y=y'right) \
&= fracleft(e^-mu_nmu_n^k/k!right)left(e^-mu_xmu_x^y'-k/(y'-k)!right)left(e^-left(mu_n+mu_xright)left(mu_n+mu_xright)^y'/y'!right)=left(beginarrayc y' \ kendarrayright) left(fracmu_nmu_n+mu_xright)^kleft(fracmu_xmu_n+mu_xright)^y'-k~,
endalign



where the second to last equality above holds since $ysim sf Poisson(mu_n+mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $left(mathbbPleft(n=k ,middle|~y=y' right)right)_kleq y'$ is $sf Binomial(y',p)$ with $p= mu_n ,/(mu_n+mu_x)$ and thus



$$Eleft[n ,middle| y=y' right] = py' = y'left(fracmu_nmu_n+mu_xright)$$



That is, $Eleft[n ,middle| y right]=yleft(fracmu_nmu_n+mu_xright)$.







share|cite|improve this answer














share|cite|improve this answer



share|cite|improve this answer








edited Mar 25 at 22:29









Lee David Chung Lin

4,47841242




4,47841242










answered Mar 24 at 20:51









Augusto S.Augusto S.

69649




69649







  • 1




    $begingroup$
    That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
    $endgroup$
    – segevp
    Mar 24 at 22:49










  • $begingroup$
    First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:39










  • $begingroup$
    Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:47











  • $begingroup$
    while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
    $endgroup$
    – segevp
    Mar 25 at 6:38






  • 1




    $begingroup$
    I am glad it helped!
    $endgroup$
    – Augusto S.
    Mar 25 at 7:07













  • 1




    $begingroup$
    That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
    $endgroup$
    – segevp
    Mar 24 at 22:49










  • $begingroup$
    First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:39










  • $begingroup$
    Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
    $endgroup$
    – Augusto S.
    Mar 24 at 23:47











  • $begingroup$
    while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
    $endgroup$
    – segevp
    Mar 25 at 6:38






  • 1




    $begingroup$
    I am glad it helped!
    $endgroup$
    – Augusto S.
    Mar 25 at 7:07








1




1




$begingroup$
That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
$endgroup$
– segevp
Mar 24 at 22:49




$begingroup$
That is a great answer! thank you so much. If I may, I have another question related(this is where things get tricky!). If $mu_x=mu_x(theta)$, how would you calculate the fisher information of $x$ on $theta$, $J_x(theta)$?
$endgroup$
– segevp
Mar 24 at 22:49












$begingroup$
First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
$endgroup$
– Augusto S.
Mar 24 at 23:39




$begingroup$
First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have placed a full solution to your question!). Regarding the Fischer information, let us proceed by degrees. You have the expression for the Fischer information given by $mathcalJ_x(theta)=Eleft[left(fracpartialpartial theta log f(x;theta)right)^2left|thetaright.right]$ where $f$ is the distribution of $x$ with parameter $theta$. You know that $x$ is Poisson with mean $mu_x(theta)$.
$endgroup$
– Augusto S.
Mar 24 at 23:39












$begingroup$
Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
$endgroup$
– Augusto S.
Mar 24 at 23:47





$begingroup$
Can you develop the expression above? $x$ is Poisson and hence, $f(x;theta)=e^-mu_x(theta)frac(mu_x(theta))^xx!$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $theta$ (I am assuming that $mu_x(theta)$ is differentiable); (iii) use the property for conditional expectations: $Eleft[Xg(Z)left|Zright.right]=g(Z)E[Xleft|Zright.]$ to further simplify the expression. See what is the expression that you get -- it will be given in terms of the derivative of $mu_x(theta)$ and $mu_x(theta)$ itself. Let me know if you have questions.
$endgroup$
– Augusto S.
Mar 24 at 23:47













$begingroup$
while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
$endgroup$
– segevp
Mar 25 at 6:38




$begingroup$
while this is true, you did helped me a lot with the homework, and I did learn much for that. I tried myself to solve it for days, so I did not had much choice. thank you again, everything is much clear now :)
$endgroup$
– segevp
Mar 25 at 6:38




1




1




$begingroup$
I am glad it helped!
$endgroup$
– Augusto S.
Mar 25 at 7:07





$begingroup$
I am glad it helped!
$endgroup$
– Augusto S.
Mar 25 at 7:07


















draft saved

draft discarded
















































Thanks for contributing an answer to Mathematics Stack Exchange!


  • Please be sure to answer the question. Provide details and share your research!

But avoid


  • Asking for help, clarification, or responding to other answers.

  • Making statements based on opinion; back them up with references or personal experience.

Use MathJax to format equations. MathJax reference.


To learn more, see our tips on writing great answers.




draft saved


draft discarded














StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3157230%2foptimal-estimator-of-poisson-distribution%23new-answer', 'question_page');

);

Post as a guest















Required, but never shown





















































Required, but never shown














Required, but never shown












Required, but never shown







Required, but never shown

































Required, but never shown














Required, but never shown












Required, but never shown







Required, but never shown







Popular posts from this blog

How should I support this large drywall patch? Planned maintenance scheduled April 23, 2019 at 00:00UTC (8:00pm US/Eastern) Announcing the arrival of Valued Associate #679: Cesar Manara Unicorn Meta Zoo #1: Why another podcast?How do I cover large gaps in drywall?How do I keep drywall around a patch from crumbling?Can I glue a second layer of drywall?How to patch long strip on drywall?Large drywall patch: how to avoid bulging seams?Drywall Mesh Patch vs. Bulge? To remove or not to remove?How to fix this drywall job?Prep drywall before backsplashWhat's the best way to fix this horrible drywall patch job?Drywall patching using 3M Patch Plus Primer

random experiment with two different functions on unit interval Announcing the arrival of Valued Associate #679: Cesar Manara Planned maintenance scheduled April 23, 2019 at 00:00UTC (8:00pm US/Eastern)Random variable and probability space notionsRandom Walk with EdgesFinding functions where the increase over a random interval is Poisson distributedNumber of days until dayCan an observed event in fact be of zero probability?Unit random processmodels of coins and uniform distributionHow to get the number of successes given $n$ trials , probability $P$ and a random variable $X$Absorbing Markov chain in a computer. Is “almost every” turned into always convergence in computer executions?Stopped random walk is not uniformly integrable

Lowndes Grove History Architecture References Navigation menu32°48′6″N 79°57′58″W / 32.80167°N 79.96611°W / 32.80167; -79.9661132°48′6″N 79°57′58″W / 32.80167°N 79.96611°W / 32.80167; -79.9661178002500"National Register Information System"Historic houses of South Carolina"Lowndes Grove""+32° 48' 6.00", −79° 57' 58.00""Lowndes Grove, Charleston County (260 St. Margaret St., Charleston)""Lowndes Grove"The Charleston ExpositionIt Happened in South Carolina"Lowndes Grove (House), Saint Margaret Street & Sixth Avenue, Charleston, Charleston County, SC(Photographs)"Plantations of the Carolina Low Countrye