B Can you perform algebra on derivatives?

INTP_ty
Question 1:

Consider the numbers 2 & 8. The average of these two quantities is 5: 2+8=10, 10/2=5. Now consider two arbitrary derivatives. It wouldn't make much sense to find the average of two unrelated derivatives, but suppose that f(x,y) were a function of both x & y. Would it then make sense to find the average of f'(x) & f'(y) to get f'(x,y)?

I watched a YouTube video on finding derivatives of functions that contain more than one variable. The tutorial had you hold one variable constant to get the derivative with respect to the other. It was a partial derivative. I'm just wondering why this is necessary & why you can't just be general about it & solve for f'(x,y)? I think I might know why. Suppose f(x,y)=x²+y³. You can't simply add the derivatives of those two variables together. That would be like adding two fractions whose denominators are different. You can of course get around this by finding the lowest common denominator. I'm wondering if you can do the same, but with functions? Perhaps you could "stretch" x² or "contract" y³ so you can add them together?
Question 2:

In question 1, I mentioned that in the video I watched, you couldn't solve for f'(x,y) & could only take the derivative with respect to one variable at a time. Is this always the case? What if the derivatives with respect to both variables were the same? Suppose f(x,y)=x²+y². Wouldn't f'(x,y) be 2x+2y? I see no reason why that wouldn't work.
 
INTP_ty said:
I watched a YouTube video on finding derivatives with functions that contain more than one variable. The tutorial had you hold one variable constant to get the derivative of the other. It was a partial derivative.

INTP_ty said:
I'm just wondering why this is necessary & why you can't just be general about it & solve for f'(x,y)?

When one differentiates partially, one assumes everything else in the functional relation is held constant.
In a way, you're saying that you only care about what's going on in one particular direction.
If I differentiate partially with respect to x, I only care about the rate of change in the x direction. Likewise, if I differentiate partially with respect to y, I only care about the change in the y direction.
But the question remains: why do we do that?
Mathematical tools, though they look abstract, are used to depict physical situations and to help analyze physical problems in the sciences, and partial derivatives carry special meanings.

Take the case of a gas-filled balloon: its volume V depends on pressure P as well as temperature T, so V(P, T). Any analysis of the gas confined in V can tell us how V changes with pressure if temperature is kept constant, and so on. These partials may be related to other thermodynamic variables/functions.
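As a concrete illustration of this "hold everything else constant" bookkeeping (my own sketch, not from the video), SymPy's `diff` computes exactly these partials for the example f(x,y) = x² + y³:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**3

# Partial w.r.t. x: y is treated as a constant, so y**3 differentiates to 0
fx = sp.diff(f, x)
# Partial w.r.t. y: x is treated as a constant, so x**2 differentiates to 0
fy = sp.diff(f, y)

print(fx, fy)  # 2*x 3*y**2
```

Each partial only sees the terms involving its own variable, which is the "direction" idea above in symbols.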
 
INTP_ty said:
I watched a YouTube video on finding derivatives with functions that contain more than one variable. The tutorial had you hold one variable constant to get the derivative of the other. It was a partial derivative. I'm just wondering why this is necessary & why you can't just be general about it & solve for f'(x,y)? I think I might know why. Suppose f(x,y)=x²+y³. You can't simply add the derivatives of those two variables together. That would be like adding two fractions together who's denominators are different. You can of course get around this by finding the lowest common denominator. I'm wondering if you can do the same, but with functions? Perhaps you could "stretch" x² or "contract" y³ so you can add them together?

The partial derivatives with respect to ##x## and ##y## are, in fact, special cases of the "directional derivative". That is, the rate of change of the function in any direction in the ##xy## plane. See, for example:

http://tutorial.math.lamar.edu/Classes/CalcIII/DirectionalDeriv.aspx

The average of ##\frac{\partial f}{\partial x}## and ##\frac{\partial f}{\partial y}## doesn't really tell you much.
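To make the directional-derivative point concrete (a sketch of mine, not from the linked page): the rate of change of ##f## in the direction of a unit vector ##(u_1, u_2)## is ##u_1\,\partial f/\partial x + u_2\,\partial f/\partial y##, so the two partials are just the special directions ##(1,0)## and ##(0,1)##:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**3
grad = [sp.diff(f, x), sp.diff(f, y)]  # [2*x, 3*y**2]

def directional_derivative(u1, u2):
    # Rate of change of f in the direction (u1, u2);
    # assumes (u1, u2) is a unit vector.
    return sp.expand(u1 * grad[0] + u2 * grad[1])

print(directional_derivative(1, 0))  # 2*x  (the partial w.r.t. x)
print(directional_derivative(0, 1))  # 3*y**2  (the partial w.r.t. y)
# A diagonal direction mixes both partials:
print(directional_derivative(1/sp.sqrt(2), 1/sp.sqrt(2)))
```

Note the diagonal direction weights both partials equally, which is closer to what averaging them would mean, but the weights depend on the direction you pick.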
 
What do you mean by f'(x,y)? If you want it to be a real number, what does that real number represent? If you plot f, its graph is a two-dimensional surface; you cannot express its "tilt" with a single number the way you can in the one-dimensional case.
 
For (multivariate) differentiable functions, the right object to consider is the tangent space. It is like laying a board on the graph so that it touches at some point. The partial derivatives then give a coordinate system on this board: they are its basis vectors, so you can add them as you add vectors, and you get all other directions as weighted sums of them. I found the Wiki page on it a bit bumpy, but if you want to have a look: https://en.wikipedia.org/wiki/Differential_of_a_function.

Since this tangent space is simply a vector space, you can do a lot of algebra on it. In fact, it is a very important tool for deriving properties of the function by easier-to-handle (linear) methods. In a small neighborhood of the point where the tangent space touches the function's graph, the two are almost identical!

Your formula in the second question is essentially correct, except that you left out the basis vectors (the differentials ##dx## and ##dy##). So
$$f'(x,y) = df = d(x^2+y^2) = 2x\,dx + 2y\,dy,$$
whose coefficient vector is ##(2x, 2y)##.
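A quick numerical check of this linearization (my sketch, point and step sizes chosen arbitrarily): for small displacements ##dx, dy##, the change in ##f(x,y)=x^2+y^2## is well approximated by ##2x\,dx + 2y\,dy##:

```python
# Verify numerically that df = 2x dx + 2y dy linearizes
# f(x, y) = x**2 + y**2 near a point.
def f(x, y):
    return x**2 + y**2

x0, y0 = 1.0, 2.0      # base point (arbitrary)
dx, dy = 1e-6, 1e-6    # small displacements

actual_change = f(x0 + dx, y0 + dy) - f(x0, y0)
linear_change = 2*x0*dx + 2*y0*dy  # gradient (2x, 2y) dotted with (dx, dy)

print(actual_change, linear_change)
```

The two numbers agree up to the quadratic error dx² + dy², which is the "almost identical in a small neighborhood" statement in action.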
 


See 4:17

fresh_42 got it. It's vectors that I was getting at.
 