The implication of zero mixed partial derivatives for multivariate function's minimization


Suppose $f(\textbf{x}) = f(x_1, x_2)$ has mixed partial derivatives $f''_{12} = f''_{21} = 0$. Can I then say that there exist $f_1(x_1)$ and $f_2(x_2)$ such that $\min_{\textbf{x}} f(\textbf{x}) = \min_{x_1} f_1(x_1) + \min_{x_2} f_2(x_2)$? Or, going further, that
$$f(\textbf{x}) \equiv f_1(x_1) + f_2(x_2)?$$



A simple positive example is $f(x_1, x_2) = x_1^2 + x_2^3$. I cannot think of a counterexample, but I am not sure and may need a proof.
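As a concrete check of a case like this (the sketch and the example function below are an editor's illustration, not part of the original question), one can verify with SymPy that a function whose mixed partial vanishes splits as $f_1(x_1) + f_2(x_2)$ and that minimizing the summands separately reproduces the joint minimum:

```python
# Editor's sketch (not from the original post): for a sample separable function,
# check that the mixed partial vanishes, that f splits as f1(x1) + f2(x2),
# and that the minimum decomposes into the two one-variable minima.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**2 + (x2 - 1)**4 + 3                    # separable by construction

assert sp.diff(f, x1, x2) == 0                 # mixed partial is identically zero

f1 = f.subs(x2, 0) - f.subs({x1: 0, x2: 0})    # f(x1, 0) - f(0, 0): depends on x1 only
f2 = f.subs(x1, 0)                             # f(0, x2): depends on x2 only
assert sp.simplify(f - (f1 + f2)) == 0         # f = f1(x1) + f2(x2)

# minimize each summand separately at its critical point
m1 = min(f1.subs(x1, c) for c in sp.solve(sp.diff(f1, x1), x1))
m2 = min(f2.subs(x2, c) for c in sp.solve(sp.diff(f2, x2), x2))
print(m1 + m2)                                 # 3, the joint minimum of f, attained at (0, 1)
```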










multivariable-calculus optimization partial-derivative






asked Jun 24 '13 at 22:43 by jorter.ji (edited Nov 1 '15 at 20:04 by user147263)
  • Concerning the passage from $f(\textbf{x}) \equiv f_1(x_1) + f_2(x_2)$ to $\min_{\textbf{x}} f(\textbf{x}) = \min_{x_1} f_1(x_1) + \min_{x_2} f_2(x_2)$, see this question.
    – user147263, Dec 29 '15 at 0:24




2 Answers


















For a mixed derivative $f_{xy} = 0$, integrating with respect to $y$ gives
$$
f_x(x,y) = \int f_{xy} \, dy + h(x).
$$
Integrating with respect to $x$:
$$
f(x,y) = \iint f_{xy} \, dy \, dx + \int h(x) \, dx + g(y).
$$
A similar result follows if we start from $f_{yx}$. Together these imply
$$
f(x,y) = f_1(x) + f_2(y),
$$
which gives the conclusion in your question.

– answered Jun 24 '13 at 23:14 by Shuhao Cao
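To see the integration step in action, here is a small SymPy illustration (an editor's sketch with a made-up example function, not part of the answer); the names $h$ and $g$ follow the answer's notation for the $x$-only and $y$-only parts:

```python
# Editor's illustration of the integration argument (example function is made up):
# with f_xy = 0, f_x depends on x alone (the h(x) above), and integrating it in x
# leaves a remainder that depends on y alone (the g(y) above), so f = f1(x) + f2(y).
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = sp.exp(x) + sp.sin(y) + 7          # any function with f_xy = 0

assert sp.diff(f, x, y) == 0           # mixed partial vanishes identically

h_x = sp.diff(f, x)                    # this plays the role of h(x)
assert sp.diff(h_x, y) == 0            # it depends on x only

f1 = sp.integrate(h_x, x)              # the x-only part, ∫ h(x) dx
f2 = sp.simplify(f - f1)               # the remainder g(y), depending on y only
assert sp.diff(f2, x) == 0
print(f1, '+', f2)                     # exp(x) + sin(y) + 7
```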






  • Hi again. Can I say $f_1$ and $f_2$ are unique, excluding constants?
    – jorter.ji, Jun 26 '13 at 6:13

  • @jorter.ji If you only know that the mixed derivative is zero, then any differentiable functions $f_1$ and $f_2$ will do, so they are not unique unless you have other conditions.
    – Shuhao Cao, Jun 26 '13 at 12:32

  • Like what kind of conditions? Or can you think of any instance such that $f = f_1 + f_2 = f_3 + f_4$?
    – jorter.ji, Jun 26 '13 at 16:49

  • @jorter.ji I meant that, given only the differential relation $f_{xy} = f_{yx} = 0$, there are infinitely many choices for $g$ and $h$ in $f = g(x) + h(y)$: $f$ could be $x + y$, or $x^2 + y^2$, and so on. Not that they are equal to each other.
    – Shuhao Cao, Jun 26 '13 at 16:52

  • Oh, I see. Actually, I already know that $f(x,y) = f_1(x) + f_2(y) + V(x,y)$, where $V(x,y)$ is an implicit but determined function, i.e. $f(x,y)$ is implicitly determined; I then wonder whether the decomposition $f = g(x) + h(y)$ is unique.
    – jorter.ji, Jun 26 '13 at 17:41


















The answer of @Shuhao Cao needs the assumption that the first partial derivative is integrable.

Here I try to provide a proof without that assumption.

Restatement

I restate the conjecture under slightly weaker conditions:

If $f(x, y)$ has $f_{yx} = 0$, then $f(x, y) = g(x) + h(y)$.

Proof

By the Mean Value Theorem,
$$f_y(x, y) = f_y(0, y) + f_{yx}(\xi, y) \, x, \qquad \xi \in (0, x).$$
This implies $f_y(x, y) = f_y(0, y)$.

For every $x_0$, the function $y \mapsto f(x_0, y)$ is an antiderivative of $f_y(0, y)$. Any two antiderivatives differ by a constant, so we can write
$$f(x, y) = f(0, y) + c(x) = f(0, y) + f(x, 0) - f(0, 0) = f_1(x) + f_2(y).$$

Annotation

The key point is that "any two antiderivatives differ by a constant" can be proved using only the Mean Value Theorem, with no reference to the Riemann integral.

– answered Mar 16 at 8:52 by TA123
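The identity used in the last display can be checked directly on an example (an editor's sketch; the sample function below is not from the answer and is only chosen so that $f_{yx} = 0$):

```python
# Editor's check of the identity f(x, y) = f(0, y) + f(x, 0) - f(0, 0)
# for a sample function whose mixed partial f_yx vanishes.
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = sp.cos(x) + y**3 - 2*y             # f_yx = 0 by construction
assert sp.diff(f, y, x) == 0

rhs = f.subs(x, 0) + f.subs(y, 0) - f.subs({x: 0, y: 0})
assert sp.simplify(f - rhs) == 0       # so f splits as f1(x) + f2(y)
```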





