The Bayes predictor of the square loss is $\Bbb E_P[Y \mid X=x]$?
Let $(X,Y) \in \Bbb X \times \Bbb Y$ be jointly distributed according to distribution $P$. Let $h: \Bbb X \rightarrow \tilde{\Bbb Y}$, where $\tilde{\Bbb Y}$ is the predicted output space, and let $L(h,P) \equiv \Bbb E_P[\ell(Y, h(X))]$, where $\ell$ is some loss function.

Show that $f = \arg\min_h L(h,P) = \Bbb E_P[Y \mid X = x]$ if $\ell$ is the square loss function: $\ell(Y, h(X)) = (Y - h(X))^2$.

I figured I'd show this by proving that any other $h$ leads to a larger $L(h,P)$ than $\Bbb E_P[Y \mid X=x]$ does.

I start with
$$\Bbb E_P\big[(Y - \Bbb E_P[Y \mid X=x])^2\big] \le \Bbb E_P\big[(Y - h(X))^2\big].$$
Expanding, we have
$$\Bbb E_P\big[Y^2 - 2Y\,\Bbb E_P[Y \mid X=x] + \Bbb E_P[Y \mid X=x]^2\big] \le \Bbb E_P\big[Y^2 - 2Y h(X) + h(X)^2\big],$$
and simplifying,
$$-2\,\Bbb E_P[Y]\,\Bbb E_P[Y \mid X=x] + \Bbb E_P[Y \mid X=x]^2 \le -2\,\Bbb E_P[Y h(X)] + \Bbb E_P[h(X)^2].$$
But from here I'm a little stuck as to how to continue. Does anyone have any ideas?

probability machine-learning
This question has an open bounty worth +100 reputation from Oliver G, ending at 2019-04-01 18:27:16Z (in 6 days). Looking for an answer drawing from credible and/or official sources.
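One standard route past the point where the derivation above stalls is not to start from the target inequality but to decompose the loss around the conditional mean. A sketch (writing $m(X) = \Bbb E_P[Y \mid X]$):

```latex
\begin{align*}
\Bbb E_P\big[(Y - h(X))^2\big]
  &= \Bbb E_P\big[\big((Y - m(X)) + (m(X) - h(X))\big)^2\big] \\
  &= \Bbb E_P\big[(Y - m(X))^2\big]
   + \Bbb E_P\big[(m(X) - h(X))^2\big] \\
  &\quad + 2\,\Bbb E_P\big[(Y - m(X))(m(X) - h(X))\big].
\end{align*}
```

The cross term vanishes by the tower property: conditioning on $X$, the factor $m(X) - h(X)$ is $X$-measurable and $\Bbb E_P[Y - m(X) \mid X] = 0$. Hence $\Bbb E_P[(Y - h(X))^2] \ge \Bbb E_P[(Y - m(X))^2]$ for every $h$, with equality iff $h(X) = m(X)$ almost surely.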
– Minus One-Twelfth (Mar 17 at 14:04): You want to show that the conditional expectation minimises the square loss. You can find a discussion of this here: stats.stackexchange.com/questions/71863/….

– leonbloy (yesterday): "I start with": you don't start with your desired conclusion. You start with what you know.

– Saad (yesterday): The link given by Minus One-Twelfth has effectively answered the question in detail. Is there anything else you'd like to know?
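As a quick numerical sanity check (not a proof), one can simulate a joint distribution and confirm that the conditional mean attains a smaller empirical square loss than other predictors. The model $Y = X^2 + \varepsilon$ below is an arbitrary illustrative choice, so that $\Bbb E[Y \mid X=x] = x^2$ is known in closed form:

```python
import numpy as np

# Illustrative joint distribution: X ~ Uniform(-1, 1), Y = X^2 + Gaussian noise,
# so the conditional mean is E[Y | X = x] = x^2 exactly.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100_000)
y = x**2 + rng.normal(0.0, 0.5, size=x.shape)

# Empirical square loss of the conditional mean h(x) = x^2 ...
cond_mean_loss = np.mean((y - x**2) ** 2)

# ... versus an arbitrary competitor h(x) = x ...
other_loss = np.mean((y - x) ** 2)

# ... and versus the best *linear* predictor fitted by least squares.
coeffs = np.polyfit(x, y, 1)
linear_loss = np.mean((y - np.polyval(coeffs, x)) ** 2)

# The conditional mean should win in both comparisons.
assert cond_mean_loss < other_loss
assert cond_mean_loss < linear_loss
```

With 100,000 samples the gap is comfortable: the conditional mean's loss concentrates near the noise variance ($0.25$), while any linear predictor also pays the unexplained variance of $X^2$.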
edited Mar 17 at 12:39 by Bernard
asked Mar 17 at 12:33 by Oliver G (Mathematics Stack Exchange)