Do we treat x and y as independent when differentiating f with respect to y?

SUMMARY

The discussion centers on differentiating the function f(x,y) = x² + y² with respect to t, where x and y are themselves functions of t (x = cos(t), y = sin(t)). Participants clarify that when taking the partial derivatives of f, x and y are treated as independent variables even though both depend on t. The chain rule ties the pieces together, and the partial derivatives ∂f/∂x = 2x and ∂f/∂y = 2y remain valid. The confusion arises from the interplay between treating x and y as independent variables and their dependence on t.

PREREQUISITES
  • Understanding of partial derivatives and their notation
  • Familiarity with the chain rule in calculus
  • Knowledge of functions of multiple variables
  • Basic understanding of functional derivatives
NEXT STEPS
  • Study the application of the chain rule in multivariable calculus
  • Learn about functional derivatives and their implications
  • Explore the concept of treating dependent variables as independent in calculus
  • Investigate the relationship between parametric equations and their derivatives
USEFUL FOR

Mathematicians, calculus students, and anyone involved in advanced mathematical analysis, particularly those working with multivariable functions and differentiation techniques.

adamg
If you are given f(x,y) = x^2 + y^2 with x = cos(t) and y = sin(t), then when you differentiate f with respect to t you use the partial derivatives of f with respect to x and y in the process. When I was taught partial derivatives, I was told that we "keep all but one of the independent variables fixed...". Now in this case, when differentiating f with respect to y, say, I don't see how this works: x (= cos(t)) cannot be held fixed while y (= sin(t)) varies, can it?

When we differentiate f do we 'forget' that x and y are functions of t, and treat them as independent?
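For reference, a minimal sympy sketch of this setup (the variable names are just for the example, and this is only one way to check it): compute ∂f/∂x and ∂f/∂y by treating x and y as independent symbols, then substitute the parametrisation and apply the chain rule.

import sympy as sp

t = sp.symbols('t')
x, y = sp.symbols('x y')      # x and y treated as independent symbols
f = x**2 + y**2

fx = sp.diff(f, x)            # 2*x  (y held fixed)
fy = sp.diff(f, y)            # 2*y  (x held fixed)

X, Y = sp.cos(t), sp.sin(t)   # the parametrisation x = cos(t), y = sin(t)
df_dt = fx.subs(x, X)*sp.diff(X, t) + fy.subs(y, Y)*sp.diff(Y, t)
print(sp.simplify(df_dt))     # 0, since cos(t)**2 + sin(t)**2 = 1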
 
f(x,y) = x(t)^2 + y(t)^2

f(x,y)_x = 2x(t)x'(t)

f(x,y)_y = 2y(t)y'(t)


You keep one constant because you want to find the rate of change of f(x,y) as x changes, not y. Same thing when differentiating with respect to y.
 
adamg said:
When I was taught partial derivatives, I was told that we "keep all but one of the independent variables fixed...". Now in this case, when differentiating f with respect to y, say, I don't see how this works: x (= cos(t)) cannot be held fixed while y (= sin(t)) varies, can it?
The independent variables for f are x and y; t has no role here. Why wouldn't one be able to be held fixed while the other varies? That is exactly what you do when you take partial derivatives anyway.

When we differentiate f do we 'forget' that x and y are functions of t, and treat them as independent?
Exactly.
 
OK, thanks. Is the partial derivative of f with respect to x just 2x then?
I don't understand the x'(t) part that you included.
 
It's an application of the chain rule which, come to think of it, doesn't belong there unless you're finding the partial with respect to t.
 
I don't know this well, but if you write f(x,y) = x^2 + y^2... then \partial_x f(x,y) = 2x...?? It's a weird question:

\frac{d}{dt}\left(\frac{\partial f}{\partial x}(x,y)\right)=\frac{d\,(2x(t))}{dt}=2x'(t)

whereas normally f(x(t),y(t)) = g(t) is a function of t only, hence \partial_x g(t) = 0... it's as if Schwarz's theorem were not valid here...

In fact I would say it depends on when you apply the substitution x \to x(t) (hence x as a variable, or x as a function of a variable...).

Because you could see x and y as functions x(t), y(t), and f(x,y) = x(t)^2 + y(t)^2 as a functional of x and y...

Then you could apply the Gateaux derivative in the "direction" of a function n (n : t \to n(t)), with the usual definition:

D_{x,n}F(x,y)=\lim_{h\to 0}\frac{F(x+hn,y)-F(x,y)}{h}=\lim_{h\to 0}\frac{x(t)^2+2hn(t)x(t)+h^2n(t)^2+y(t)^2-x(t)^2-y(t)^2}{h}
=2n(t)x(t)

so that we recover whozum's result by taking the functional derivative along the direction n(t) = x'(t).
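A rough numerical illustration of that Gateaux derivative (a sketch only; the choices x(t) = cos t, y(t) = sin t and the direction n(t) = t^2 are arbitrary examples): approximate the limit with a small h and compare the result against 2 n(t) x(t).

import numpy as np

def F(xf, yf, t):
    # the functional F[x, y](t) = x(t)^2 + y(t)^2, evaluated pointwise at t
    return xf(t)**2 + yf(t)**2

x = np.cos            # example: x(t) = cos(t)
y = np.sin            # example: y(t) = sin(t)
n = lambda s: s**2    # an arbitrary direction function n(t)

t, h = 0.7, 1e-6
# finite-difference approximation of the Gateaux derivative in the direction n
approx = (F(lambda s: x(s) + h*n(s), y, t) - F(x, y, t)) / h
exact = 2*n(t)*x(t)
print(approx, exact)  # the two values agree up to terms of order h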
 
whozum said:
The independent variables for f are x and y; t has no role here. Why wouldn't one be able to be held fixed while the other varies? That is exactly what you do when you take partial derivatives anyway.



That's what I was confused about: we treat x and y as independent variables, even though they won't really vary independently (since both depend on t).
 
So what is your conclusion klein? What is

\frac{\delta f}{\delta x}\ \text{and}\ \frac{\delta f}{\delta t}\,?
 
That's the notation for the functional derivative of "f" wrt "x" or "t".

Daniel.
 
  • #10
whozum said:
f(x,y) = x(t)^2 + y(t)^2

f(x,y)_x = 2x(t)x'(t)

f(x,y)_y = 2y(t)y'(t)


You keep one constant because you want to find the rate of change of f(x,y) as x changes, not y. Same thing when differentiating with respect to y.

This makes no sense at all: x(t)^2 + y(t)^2 is a function of t, not of x and y, and so cannot be equal to f(x,y).
Even worse are f(x,y)_x = 2x(t)x'(t) and f(x,y)_y = 2y(t)y'(t). If f is a function of t, then it makes no sense to write "f_x(x,y)".

What IS true is that if f(x,y) = x^2 + y^2, then f_x(x,y) = 2x and f_y(x,y) = 2y.
IF, further, x and y are themselves functions of t, then f is really a function of t, f(t), and f'(t) = (2x)x'(t) + (2y)y'(t).
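A short sympy sketch of that last statement (the names are illustrative, and x(t), y(t) are left as arbitrary functions): differentiating f(x(t), y(t)) directly with respect to t reproduces (2x)x'(t) + (2y)y'(t).

import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')(t)   # arbitrary differentiable x(t)
y = sp.Function('y')(t)   # arbitrary differentiable y(t)

f = x**2 + y**2           # f, now really a function of t

direct = sp.diff(f, t)                            # differentiate directly with respect to t
chain  = 2*x*sp.diff(x, t) + 2*y*sp.diff(y, t)    # (2x)x'(t) + (2y)y'(t)
print(sp.simplify(direct - chain))                # 0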
 
  • #11
What about if you had f(x,y) = x^2 + y^2 again, but with y a function of x? Then for the partial derivative of f with respect to x, you keep y fixed (?) and let x vary, even though y is a function of x (?). Thanks.
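For this follow-up, a small sympy sketch (g is a hypothetical placeholder for an arbitrary y = g(x)) separating the two notions: the partial derivative keeps y fixed and gives 2x, while the total derivative with respect to x picks up the extra 2y·g'(x) term.

import sympy as sp

x, y = sp.symbols('x y')     # y treated as independent for the partial derivative
f = x**2 + y**2

print(sp.diff(f, x))         # partial derivative: 2*x  (y held fixed)

g = sp.Function('g')(x)      # now let y be an arbitrary function of x
total = sp.diff(f.subs(y, g), x)
print(total)                 # total derivative: 2*x + 2*g(x)*Derivative(g(x), x)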
 
  • #12
HallsofIvy said:
IF, further, x and y are themselves functions of t, then f is really a function of t, f(t), and f'(t) = (2x)x'(t) + (2y)y'(t).

So f_x(x,y) doesn't really exist? What you described is what I did, but just assuming that you could differentiate with respect to each variable.
 
  • #13
What whozum did is the following :

f(x,y) = x(t)^2 + y(t)^2... (which, by the way, is a functional, not a function)

Then you thought: OK, I want to differentiate with respect to x, hence I should move x a bit. The problem is that x is a function, so you did not vary it by another function but by the parameter of that function, t.

so you did :

\lim_{h\to 0}\frac{f(x(t+h),y(t))-f(x(t),y(t))}{h}=\lim_{h\to 0}\frac{x(t+h)^2-x(t)^2}{h}\approx\lim_{h\to 0}\frac{(x(t)+hx'(t)+\dots)^2-x(t)^2}{h}=2x(t)x'(t)

But you could instead vary x by a function n, rather than varying the variable of x...
 
  • #14
Somehow, I doubt whozum was thinking along the lines of functionals here.
whozum:
Let x,y be given as functions x=X(t), y=Y(t)
What is now correct is that we may define a function F(t)=f(X(t),Y(t))
Then, we have by the chain rule:
\frac{dF}{dt}=\frac{\partial f}{\partial x}\bigg|_{(x,y)=(X(t),Y(t))}\frac{dX}{dt}+\frac{\partial f}{\partial y}\bigg|_{(x,y)=(X(t),Y(t))}\frac{dY}{dt}
As HallsofIvy has already said.
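A quick numerical sanity check of that formula (a sketch using the thread's X(t) = cos t, Y(t) = sin t; the evaluation point and step size are arbitrary): compare a central-difference approximation of dF/dt with the chain-rule value.

import numpy as np

f = lambda u, v: u**2 + v**2
X, Y = np.cos, np.sin
dX = lambda s: -np.sin(s)    # X'(t)
dY = np.cos                  # Y'(t)

F = lambda s: f(X(s), Y(s))  # F(t) = f(X(t), Y(t))

t, h = 1.3, 1e-6
finite_diff = (F(t + h) - F(t - h)) / (2*h)   # central difference for dF/dt
chain_rule  = 2*X(t)*dX(t) + 2*Y(t)*dY(t)     # (∂f/∂x)X'(t) + (∂f/∂y)Y'(t)
print(finite_diff, chain_rule)                # both are ~0 here, since F(t) = 1 for all t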
 
  • #15
arildno said:
Somehow, I doubt whozum was thinking along the lines of functionals here.
whozum:
Let x,y be given as functions x=X(t), y=Y(t)
What is now correct is that we may define a function F(t)=f(X(t),Y(t))
Then, we have by the chain rule:
\frac{dF}{dt}=\frac{\partial f}{\partial x}\bigg|_{(x,y)=(X(t),Y(t))}\frac{dX}{dt}+\frac{\partial f}{\partial y}\bigg|_{(x,y)=(X(t),Y(t))}\frac{dY}{dt}
As HallsofIvy has already said.

Thanks for doubting me ;) It's justified, though.

You are correct, and I understand what HallsofIvy said and what you are saying, but the question presented by the OP was the last question I posted two or three posts ago; this is the only question I have left.

I'm also trying to think of a similar instance where the independent variable is a function of another variable; I will let you know.

Thanks.
 
