Optimal estimator of Poisson distribution
As part of my homework, I was asked to solve the following:

A signal $x$ and noise $n$ are stochastically independent, and it is known that $n\sim \operatorname{Poiss}(\mu_n)$ and $x\sim \operatorname{Poiss}(\mu_x)$. We define $y=x+n$.

- Find the optimal (MMSE) estimator of $y$ given $x$.
- Find the optimal linear (MMSE) estimator of $x$ given $y$.

I'm not familiar with optimal estimators, and I couldn't find a good source to study them. Any help will be appreciated.

signal-processing online-resources
Since you mention MMSE, perhaps going through the wiki page might help.
– StubbornAtom, Mar 21 at 19:12
asked Mar 21 at 18:44 by segevp (edited Mar 24 at 15:36)
1 Answer
(1) The MMSE estimator is given by $\widehat{y}=E\left[y \,\middle|\, x\right]$, and the equation $y=x+n$ yields
$$E\left[y \,\middle|\, x\right] = E\left[x \,\middle|\, x\right] + E\left[n \,\middle|\, x\right] = x + E[n] = x + \mu_n\,,$$
where the second identity holds in view of the independence of $x$ and $n$.
In other words, $\widehat{y}=x+\mu_n$.

(2) The MMSE estimator is given by $\widehat{x}=E\left[x \,\middle|\, y\right]$. The relation $y=x+n$ yields $E\left[x \,\middle|\, y\right] = y - E\left[n \,\middle|\, y\right]$. As shown below, we have $E\left[n \,\middle|\, y\right] = y\left(\frac{\mu_n}{\mu_n+\mu_x}\right)$.
Therefore, $\widehat{x}=y\left(\frac{\mu_x}{\mu_n+\mu_x}\right)$.

We now show that $E\left[n \,\middle|\, y\right] = y\left(\frac{\mu_n}{\mu_n+\mu_x}\right)$. We have
$$E\left[n \,\middle|\, y=y'\right] = \sum_{k\leq y'} k\,\mathbb{P}\left(n=k \,\middle|\, y=y'\right)\,,$$
and therefore
\begin{align}
\mathbb{P}\left(n=k \,\middle|\, y=y'\right) &= \frac{\mathbb{P}\left(n=k,\, y=y'\right)}{\mathbb{P}\left(y=y'\right)} \\
&= \frac{\mathbb{P}\left(n=k,\, x=y'-k\right)}{\mathbb{P}\left(y=y'\right)} \\
&= \frac{\mathbb{P}\left(n=k\right)\mathbb{P}\left(x=y'-k\right)}{\mathbb{P}\left(y=y'\right)} \\
&= \frac{\left(e^{-\mu_n}\mu_n^{k}/k!\right)\left(e^{-\mu_x}\mu_x^{y'-k}/(y'-k)!\right)}{e^{-\left(\mu_n+\mu_x\right)}\left(\mu_n+\mu_x\right)^{y'}/y'!} = \binom{y'}{k}\left(\frac{\mu_n}{\mu_n+\mu_x}\right)^{k}\left(\frac{\mu_x}{\mu_n+\mu_x}\right)^{y'-k}\,,
\end{align}
where the second-to-last equality above holds since $y\sim \mathsf{Poisson}(\mu_n+\mu_x)$, which follows from the independence of $x$ and $n$. In other words, for $y'$ fixed, $\left(\mathbb{P}\left(n=k \,\middle|\, y=y'\right)\right)_{k\leq y'}$ is $\mathsf{Binomial}(y',p)$ with $p=\mu_n/(\mu_n+\mu_x)$, and thus
$$E\left[n \,\middle|\, y=y'\right] = p\,y' = y'\left(\frac{\mu_n}{\mu_n+\mu_x}\right).$$
That is, $E\left[n \,\middle|\, y\right] = y\left(\frac{\mu_n}{\mu_n+\mu_x}\right)$.
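The binomial conditional law above is easy to verify numerically. A minimal sketch (the values $\mu_n=2$, $\mu_x=3$, $y'=7$ are arbitrary choices for illustration) computes $\mathbb{P}(n=k\mid y=y')$ directly from the Poisson pmfs and compares it with the claimed $\mathsf{Binomial}\big(y',\,\mu_n/(\mu_n+\mu_x)\big)$ law:

```python
from math import exp, factorial, comb

def pois(k, mu):
    # Poisson pmf P(X = k) for mean mu
    return exp(-mu) * mu**k / factorial(k)

mu_n, mu_x, yp = 2.0, 3.0, 7  # illustrative parameters, chosen arbitrarily

# P(n = k | y = y') = P(n = k) P(x = y'-k) / P(y = y'), with y ~ Poisson(mu_n + mu_x)
cond = [pois(k, mu_n) * pois(yp - k, mu_x) / pois(yp, mu_n + mu_x)
        for k in range(yp + 1)]

# Claimed Binomial(y', p) law with p = mu_n / (mu_n + mu_x)
p = mu_n / (mu_n + mu_x)
binom = [comb(yp, k) * p**k * (1 - p)**(yp - k) for k in range(yp + 1)]

max_err = max(abs(a - b) for a, b in zip(cond, binom))

# Conditional mean E[n | y = y'], which should equal p * y'
mean_n_given_y = sum(k * c for k, c in enumerate(cond))
```

The conditional mean recovered this way equals $p\,y'$, so the estimate $\widehat{x} = y' - E[n\mid y=y'] = y'\,\mu_x/(\mu_n+\mu_x)$ matches part (2).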
That is a great answer! Thank you so much. If I may, I have another related question (this is where things get tricky!): if $\mu_x=\mu_x(\theta)$, how would you calculate the Fisher information of $x$ on $\theta$, $J_x(\theta)$?
– segevp, Mar 24 at 22:49
First, let me say that by reading the question again, I realized that this was a homework problem and that you were asking more for hints and references (so I should not have posted a full solution to your question!). Regarding the Fisher information, let us proceed by degrees. You have the expression for the Fisher information given by $\mathcal{J}_x(\theta)=E\left[\left(\frac{\partial}{\partial\theta}\log f(x;\theta)\right)^2 \,\middle|\, \theta\right]$, where $f$ is the distribution of $x$ with parameter $\theta$. You know that $x$ is Poisson with mean $\mu_x(\theta)$.
– Augusto S., Mar 24 at 23:39
Can you develop the expression above? $x$ is Poisson and hence $f(x;\theta)=e^{-\mu_x(\theta)}\frac{(\mu_x(\theta))^x}{x!}$. Can you: (i) compute the log of this expression; (ii) compute the derivative with respect to $\theta$ (I am assuming that $\mu_x(\theta)$ is differentiable); (iii) use the property of conditional expectations $E\left[X g(Z) \,\middle|\, Z\right] = g(Z)\,E\left[X \,\middle|\, Z\right]$ to further simplify the expression. See what expression you get -- it will be given in terms of $\mu_x(\theta)$ and its derivative. Let me know if you have questions.
– Augusto S., Mar 24 at 23:47
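Carrying out steps (i)–(iii) above gives the closed form $\mathcal{J}_x(\theta)=\frac{(\mu_x'(\theta))^2}{\mu_x(\theta)}$ for a Poisson family with differentiable mean $\mu_x(\theta)$. A minimal numerical check of this formula, using the hypothetical parametrization $\mu_x(\theta)=\theta^2$ (any differentiable choice would work the same way):

```python
from math import exp, factorial

theta = 1.5
mu = theta**2       # hypothetical parametrization mu_x(theta) = theta^2
dmu = 2 * theta     # its derivative mu_x'(theta)

def pois(k, m):
    # Poisson pmf P(x = k) for mean m
    return exp(-m) * m**k / factorial(k)

# Score: d/dtheta log f(k; theta) = mu_x'(theta) * (k/mu_x - 1)
# Fisher information J = E[score^2], evaluated by summing against the pmf
J_num = sum(pois(k, mu) * (dmu * (k / mu - 1))**2 for k in range(100))

J_closed = dmu**2 / mu  # closed form (mu_x')^2 / mu_x
```

The agreement is no accident: the score has mean zero, so its second moment is $\mu_x'^2\,\operatorname{Var}(x)/\mu_x^2$, and $\operatorname{Var}(x)=\mu_x$ for a Poisson variable.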
While this is true, you did help me a lot with the homework, and I did learn much from it. I tried to solve it myself for days, so I did not have much choice. Thank you again, everything is much clearer now :)
– segevp, Mar 25 at 6:38
I am glad it helped!
– Augusto S., Mar 25 at 7:07
answered Mar 24 at 20:51 by Augusto S. (edited Mar 25 at 22:29 by Lee David Chung Lin)
Thanks for contributing an answer to Mathematics Stack Exchange!