The implication of zero mixed partial derivatives for multivariate function's minimization
Suppose $f(\mathbf{x}) = f(x_1, x_2)$ has mixed partial derivatives $f''_{12} = f''_{21} = 0$. Can I then say that there exist $f_1(x_1)$ and $f_2(x_2)$ such that $\min_{\mathbf{x}} f(\mathbf{x}) \equiv \min_{x_1} f_1(x_1) + \min_{x_2} f_2(x_2)$? Or, even further,
$$f(\mathbf{x}) \equiv f_1(x_1) + f_2(x_2)?$$
A simple positive case is $f(x_1, x_2) = x_1^2 + x_2^3$. I cannot think of a counterexample, but I am not sure about it and may need a proof.
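As a quick numerical sanity check of the "min decomposes" claim for a separable objective, here is a small sketch (the quadratic $f$ below is a hypothetical example of my own choosing, not from the question):

```python
import numpy as np

# Hypothetical separable objective with f''_12 = f''_21 = 0:
#   f(x1, x2) = (x1 - 1)^2 + (x2 + 2)^2 = f1(x1) + f2(x2)
f1 = lambda x1: (x1 - 1.0) ** 2
f2 = lambda x2: (x2 + 2.0) ** 2

x1 = np.linspace(-5.0, 5.0, 1001)
x2 = np.linspace(-5.0, 5.0, 1001)
X1, X2 = np.meshgrid(x1, x2)

joint_min = (f1(X1) + f2(X2)).min()         # minimize jointly over the grid
separate_min = f1(x1).min() + f2(x2).min()  # minimize each summand alone

# For a separable objective the two agree (up to grid resolution).
assert np.isclose(joint_min, separate_min)
```

This only illustrates the easy direction (a sum of single-variable terms minimizes term by term); whether the zero mixed partials force such a sum is the substance of the question.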
multivariable-calculus optimization partial-derivative
Concerning the passage from $f(\mathbf{x}) \equiv f_1(x_1) + f_2(x_2)$ to $\min_{\mathbf{x}} f(\mathbf{x}) = \min_{x_1} f_1(x_1) + \min_{x_2} f_2(x_2)$, see this question.
– user147263
Dec 29 '15 at 0:24
edited Nov 1 '15 at 20:04
user147263
asked Jun 24 '13 at 22:43
jorter.ji
2 Answers
For a mixed derivative $f_{xy} = 0$, integrating with respect to $y$ gives
$$
f_x(x,y) = \int f_{xy}\,dy + h(x).
$$
Integrating with respect to $x$,
$$
f(x,y) = \iint f_{xy}\,dy\,dx + \int h(x)\,dx + g(y).
$$
A similar result holds if we start from $f_{yx}$. Together these imply
$$
f(x,y) = f_1(x) + f_2(y),
$$
which is the conclusion in your question.
answered Jun 24 '13 at 23:14
Shuhao Cao
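The two integration steps can be checked symbolically. A minimal SymPy sketch, using a hypothetical $f$ chosen for illustration:

```python
import sympy as sp

x, y = sp.symbols('x y')

# Hypothetical f whose mixed partial vanishes identically
f = x**2 + sp.exp(y)
assert sp.diff(f, x, y) == 0  # f_xy = 0

# Following the answer: integrate f_x back over x to recover the x-part;
# the leftover then depends on y alone.
f1 = sp.integrate(sp.diff(f, x), x)   # the g(x) part
f2 = sp.simplify(f - f1)              # the h(y) part
assert not f2.has(x)                  # leftover is a function of y only
assert sp.simplify(f - (f1 + f2)) == 0
```

Here the "constant of integration" in each step is a function of the other variable, which is exactly where $h(x)$ and $g(y)$ come from in the answer.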
Hi again. Can I say $f_1$ and $f_2$ are unique, up to constants?
– jorter.ji
Jun 26 '13 at 6:13
@jorter.ji If you only know that the mixed derivative is zero, then any differentiable functions $f_1$ and $f_2$ will do. So they are not unique unless you have other conditions.
– Shuhao Cao
Jun 26 '13 at 12:32
Like what kind of conditions? Or can you think of any instance such that $f = f_1 + f_2 = f_3 + f_4$?
– jorter.ji
Jun 26 '13 at 16:49
@jorter.ji I meant that, given only the differential relation $f_{xy} = f_{yx} = 0$, there are infinitely many choices for $g$ and $h$ in $f = g(x) + h(y)$: $f$ can be $x + y$, or $x^2 + y^2$, and so on. Not that they are equal or anything.
– Shuhao Cao
Jun 26 '13 at 16:52
Oh, I see your meaning. Actually, I already know that $f(x,y) = f_1(x) + f_2(y) + V(x,y)$, where $V(x,y)$ is an implicit but determined function, i.e., $f(x,y)$ is somehow implicitly determined; then I wonder whether $f = g(x) + h(y)$ is unique.
– jorter.ji
Jun 26 '13 at 17:41
The answer of @Shuhao Cao needs the assumption that the first partial derivative is integrable.
Here I try to provide a proof without that assumption.
Restatement
I restate the conjecture under slightly weaker conditions:
If $f(x, y)$ has $f_{yx} = 0$, then $f(x, y) = g(x) + h(y)$.
Proof
By the Mean Value Theorem, $$f_y(x, y) = f_y(0, y) + f_{yx}(\xi, y)\,x, \quad \xi \in (0, x).$$ This implies $f_y(x, y) = f_y(0, y)$.
For every $x_0$, the function $y \mapsto f(x_0, y)$ is an antiderivative of $f_y(0, y)$. Any two antiderivatives differ by a constant, so we can write $$f(x, y) = f(0, y) + c(x) = f(0, y) + f(x, 0) - f(0, 0) = f_1(x) + f_2(y).$$
Annotation
The key is that "any two antiderivatives differ by a constant" can be proved using only the Mean Value Theorem, with no Riemann integration.
answered Mar 16 at 8:52
TA123
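The closed-form split this proof produces, $f_1(x) = f(x,0) - f(0,0)$ and $f_2(y) = f(0,y)$, can be spot-checked numerically. A sketch with a hypothetical $f$ chosen for illustration:

```python
import math
import random

# Hypothetical f with f_yx = 0 everywhere
def f(x, y):
    return x**3 + math.cos(y)

# Decomposition from the proof: f(x, y) = [f(x, 0) - f(0, 0)] + f(0, y)
def f1(x):
    return f(x, 0.0) - f(0.0, 0.0)

def f2(y):
    return f(0.0, y)

random.seed(0)
for _ in range(100):
    x, y = random.uniform(-5, 5), random.uniform(-5, 5)
    assert math.isclose(f(x, y), f1(x) + f2(y), rel_tol=1e-9, abs_tol=1e-9)
```

Note the split is pinned down by the normalization $f_1(0) = 0$; without such a normalization, constants can be shuffled between $f_1$ and $f_2$, which is the non-uniqueness discussed in the comments above.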