Sum of squared differences question



I'm completing a homework question and I need to prove the following:



$$
\sum_{i=1}^N \|\vec{y} - \vec{x}_i\|^2 = \sum_{i=1}^N \|\vec{x}_i - \vec{\bar{x}}\|^2 + N\|\vec{y}-\vec{\bar{x}}\|^2
$$



where $\vec{\bar{x}} = \frac{1}{N} \sum_{i=1}^N \vec{x}_i$, $\|\cdot\|$ is the Euclidean norm, and these are all vectors.



I think I probably need to do something similar to the proof for the sample variance, but I'm not sure how to modify it now that a $y$ is involved.
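
As a quick sanity check (not a proof), the identity can be verified numerically. Below is a minimal sketch assuming NumPy is available; the data are made-up random vectors, not anything from the question.

    import numpy as np

    # Illustrative check of the identity with random example data (hypothetical).
    rng = np.random.default_rng(0)
    N, d = 5, 3
    x = rng.normal(size=(N, d))      # rows are the vectors x_i
    y = rng.normal(size=d)           # an arbitrary vector y
    x_bar = x.mean(axis=0)           # the mean vector

    lhs = np.sum(np.linalg.norm(y - x, axis=1) ** 2)
    rhs = np.sum(np.linalg.norm(x - x_bar, axis=1) ** 2) + N * np.linalg.norm(y - x_bar) ** 2

    # The two sides should agree up to floating-point rounding error.
    assert np.isclose(lhs, rhs)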










statistics summation proof-explanation statistical-inference






asked Mar 25 at 10:06, edited Mar 25 at 16:32
user-2147482565



  • Is this a Euclidean norm or a general norm?
    – Kavi Rama Murthy
    Mar 25 at 10:10

  • This is the Euclidean norm - I should have said; I've edited the question now.
    – user-2147482565
    Mar 25 at 10:16

  • If you prove this for $y=0$ you can prove it for any $y$ by changing $x_i$ to $x_i-y$. For $y=0$ use brute force.
    – Kavi Rama Murthy
    Mar 25 at 10:24
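
A sketch of the reduction suggested in the last comment (the shifted vectors $\vec{z}_i$ below are extra notation, not part of the question): writing $\vec{z}_i = \vec{x}_i - \vec{y}$, the mean of the $\vec{z}_i$ is $\vec{\bar{z}} = \vec{\bar{x}} - \vec{y}$, and

$$
\sum_{i=1}^N \|\vec{y}-\vec{x}_i\|^2 = \sum_{i=1}^N \|\vec{z}_i\|^2, \qquad
\sum_{i=1}^N \|\vec{x}_i-\vec{\bar{x}}\|^2 = \sum_{i=1}^N \|\vec{z}_i-\vec{\bar{z}}\|^2, \qquad
N\|\vec{y}-\vec{\bar{x}}\|^2 = N\|\vec{\bar{z}}\|^2,
$$

so the general identity is equivalent to the $y=0$ case $\sum_{i=1}^N \|\vec{z}_i\|^2 = \sum_{i=1}^N \|\vec{z}_i-\vec{\bar{z}}\|^2 + N\|\vec{\bar{z}}\|^2$.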

















2 Answers


















Hint:



$$(y-x_i)(y-x_i)=((y-\overline{x})+(\overline{x}-x_i))((y-\overline{x})+(\overline{x}-x_i))
\\=(y-\overline{x})(y-\overline{x})+2(y-\overline{x})(\overline{x}-x_i)+(\overline{x}-x_i)(\overline{x}-x_i).$$



Now take the average over $i$.






– Yves Daoust, answered Mar 25 at 11:06












  • But I have norms here. If I take the sum over $i$ and we have norms instead of brackets, how do the cross products cancel out?
    – user-2147482565
    Mar 25 at 12:38

  • @RyanChan: there are no cross products here.
    – Yves Daoust
    Mar 25 at 13:07

  • Sorry, I meant cross product not as in the dot product, but the terms $2\|y-\bar{x}\|\,\|\bar{x}-x_i\|$.
    – user-2147482565
    Mar 25 at 13:11

  • @RyanChan: there are no such terms in the development. When averaging, the middle dot product cancels out.
    – Yves Daoust
    Mar 25 at 13:12

  • Sorry, I'm just confused - the term is in the hint that you gave?
    – user-2147482565
    Mar 25 at 13:13
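
To spell out the cancellation discussed in these comments (a sketch using only the definition of $\vec{\bar{x}}$ from the question): the deviations from the mean sum to the zero vector, so the middle term of the hint vanishes once it is summed over $i$; no terms of the form $2\|y-\bar{x}\|\,\|\bar{x}-x_i\|$ ever appear.

$$
\sum_{i=1}^N (\overline{x}-x_i) = N\overline{x} - \sum_{i=1}^N x_i = \vec{0}
\quad\Longrightarrow\quad
\sum_{i=1}^N 2(y-\overline{x})\cdot(\overline{x}-x_i) = 2(y-\overline{x})\cdot\sum_{i=1}^N(\overline{x}-x_i) = 0.
$$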



















\begin{align*}
\sum_{i=1}^N \|\vec{y} - \vec{x}_i\|^2 & = \sum_{i=1}^N \|(\vec{y} - \vec{\bar{x}}) + (\vec{\bar{x}} - \vec{x}_i)\|^2 \\
& = \sum_{i=1}^N ((\vec{y} - \vec{\bar{x}}) + (\vec{\bar{x}} - \vec{x}_i))((\vec{y} - \vec{\bar{x}}) + (\vec{\bar{x}} - \vec{x}_i))^T \\
& = \sum_{i=1}^N (\vec{y} - \vec{\bar{x}})(\vec{y} - \vec{\bar{x}})^T + 2 \sum_{i=1}^N (\vec{y} - \vec{\bar{x}})(\vec{\bar{x}} - \vec{x}_i)^T + \sum_{i=1}^N (\vec{\bar{x}} - \vec{x}_i)(\vec{\bar{x}} - \vec{x}_i)^T \\
& = \sum_{i=1}^N \|\vec{\bar{x}} - \vec{x}_i\|^2 + N \|\vec{y} - \vec{\bar{x}}\|^2
\end{align*}

since

\begin{align*}
\sum_{i=1}^N (\vec{y} - \vec{\bar{x}})(\vec{\bar{x}} - \vec{x}_i)^T & = \sum_{i=1}^N \sum_j (y_j - \bar{x}_j)(\bar{x}_j - x_{ij}) \\
& = \sum_{i=1}^N \sum_j (y_j\bar{x}_j - y_jx_{ij} - \bar{x}_j^2 + \bar{x}_jx_{ij}) \\
& = \sum_j \Bigg[ Ny_j\bar{x}_j - y_j \sum_{i=1}^N x_{ij} - N\bar{x}_j^2 + \bar{x}_j\sum_{i=1}^N x_{ij} \Bigg] \\
& = \sum_j \Bigg[ Ny_j\bar{x}_j - y_jN\bar{x}_j - N\bar{x}_j^2 + \bar{x}_jN\bar{x}_j \Bigg] \\
& = \sum_j 0 \\
& = 0
\end{align*}






– user-2147482565, answered Mar 26 at 9:14, edited Mar 26 at 10:15












