The Bayes predictor of the square loss is $\Bbb E_P[Y\mid X=x]$?


Let $(X,Y) \in \Bbb X \times \Bbb Y$ be jointly distributed according
to distribution $P$. Let $h: \Bbb X \rightarrow \tilde{\Bbb Y}$,
where $\tilde{\Bbb Y}$ is the space of predicted outputs. Let $L(h,P) \equiv
\Bbb E_P[l(Y, h(X))]$,
where $l$ is some loss function.

Show that the minimizer $f = \arg\min_h L(h,P)$ is given by $f(x) = \Bbb E_P[Y \mid X = x]$ when $l$ is the
square loss function: $l(Y, h(X)) = (Y - h(X))^2$.




I figured I would show this by proving that any other $h$ leads to a larger $L(h,P)$ than $h(x) = \Bbb E_P[Y\mid X=x]$ does.



I start with $$\Bbb E_P[(Y - \Bbb E_P[Y\mid X=x])^2] \le \Bbb E_P[(Y - h(X))^2]$$



Then expanding we have:



$$\Bbb E_P[Y^2 - 2Y\,\Bbb E_P[Y\mid X=x] + \Bbb E_P[Y\mid X=x]^2] \le \Bbb E_P[Y^2 - 2Yh(X) + h(X)^2]$$



And simplifying:



$$-2\,\Bbb E_P\big[Y\,\Bbb E_P[Y\mid X=x]\big] + \Bbb E_P\big[\Bbb E_P[Y\mid X=x]^2\big] \le -2\,\Bbb E_P[Yh(X)] + \Bbb E_P[h(X)^2]$$

(Note that $\Bbb E_P[Y\mid X=x]$ is a function of $X$, so it cannot be pulled outside the outer expectation.)



But from here I'm a little stuck as to how to continue.



Does anyone have any ideas?











This question has an open bounty worth +100 reputation from Oliver G, ending at 2019-04-01 18:27:16Z (in 6 days).

Looking for an answer drawing from credible and/or official sources.











  • You want to show that the conditional expectation minimises the square loss. You can find discussion of this here: stats.stackexchange.com/questions/71863/…. – Minus One-Twelfth, Mar 17 at 14:04
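For readers following that link: the standard route behind such proofs (a sketch, in the question's notation) is to add and subtract $\Bbb E_P[Y\mid X]$ inside the square, rather than starting from the target inequality:

$$\begin{aligned}
\Bbb E_P\big[(Y - h(X))^2\big]
&= \Bbb E_P\big[\big(Y - \Bbb E_P[Y\mid X]\big)^2\big]
 + \Bbb E_P\big[\big(\Bbb E_P[Y\mid X] - h(X)\big)^2\big] \\
&\quad + 2\,\Bbb E_P\big[\big(Y - \Bbb E_P[Y\mid X]\big)\big(\Bbb E_P[Y\mid X] - h(X)\big)\big].
\end{aligned}$$

Conditioning on $X$, the cross term vanishes by the tower property, since $\Bbb E_P\big[Y - \Bbb E_P[Y\mid X]\mid X\big] = 0$. The first term does not depend on $h$, and the second is nonnegative and is zero exactly when $h(X) = \Bbb E_P[Y\mid X]$ almost surely, which gives the claimed minimizer.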










  • "I start with" — you don't start with your desired conclusion. You start with what you know. – leonbloy, yesterday










  • The link given by Minus One-Twelfth has effectively answered the question in detail. Is there anything else you'd like to know? – Saad, yesterday
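As a quick numeric sanity check of the claim (an illustration, not a proof — the simulated distribution below is just one convenient choice with a known conditional mean), one can compare the empirical square loss of the conditional mean against a few competing predictors:

```python
import random

random.seed(0)

# Simulate (X, Y) with a known conditional mean: Y = X^2 + noise,
# so E[Y | X = x] = x^2. This particular distribution is illustrative.
n = 200_000
xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
ys = [x * x + random.gauss(0.0, 0.5) for x in xs]

def square_loss(h):
    """Empirical estimate of E[(Y - h(X))^2]."""
    return sum((y - h(x)) ** 2 for x, y in zip(xs, ys)) / n

y_bar = sum(ys) / n  # overall mean of Y, the best constant predictor

bayes = square_loss(lambda t: t * t)       # the conditional mean E[Y | X = x] = x^2
competitors = [
    square_loss(lambda t: t),              # identity predictor
    square_loss(lambda t: y_bar),          # best constant predictor
    square_loss(lambda t: 0.9 * t * t),    # slightly perturbed conditional mean
]

# Up to sampling noise, the conditional mean should beat every competitor;
# its loss should be close to the noise variance, 0.5^2 = 0.25.
print(round(bayes, 3), round(min(competitors), 3))
assert all(bayes <= c for c in competitors)
```

Even the small perturbation ($0.9\,x^2$) loses: its excess loss is the second term of the decomposition, $\Bbb E_P[(\Bbb E_P[Y\mid X] - h(X))^2] = 0.01\,\Bbb E[X^4] > 0$.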















Tags: probability, machine-learning






asked Mar 17 at 12:33 by Oliver G

edited Mar 17 at 12:39 by Bernard





