# Non-linear into linear

In summary, the conversation explores the possibility of converting non-linear functions into systems of linear functions to make integration easier. Various approaches, such as changing variables or factoring, are discussed, but ultimately it is determined that this approach is unlikely to succeed. The concept of being locally Euclidean and its application in differential geometry is mentioned, and there is a debate about whether it is the function itself or the change of the function that is considered linear in differentiation.

#### TheoEndre

Hello everyone,
I've always had this question in my mind: Can we convert a non-linear function into a system of linear functions?
I don't know if this actually exists in math (I searched a little, to be honest), but I'm really interested in this question because it would make integration much easier (and probably other things).
For example:
##\int \left(x^2 \pm k\right)^n dx## where ##n##, ##k## are integers.
If we could just reduce the power of ##x^2## to the first degree, the integral would be much easier to find than by using a trig substitution.

All I can think of is either using a change of variable, ##x^2 = u##, but then you have ##2x\,dx = du## and need to make the corresponding changes, or factoring ##x^2 \pm k## into ##(x + i\sqrt{k})(x - i\sqrt{k})##, but then the integral is not multiplicative. After that, the next trick is to try Wolfram's ... ;).
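To see concretely why the substitution doesn't buy anything, here is a small sketch (my own illustration, assuming SymPy is available) that carries out ##x^2 = u## symbolically; the square root it drags in is no friendlier than the original power:

```python
import sympy as sp

x, u = sp.symbols('x u', positive=True)
k = sp.symbols('k', positive=True)
n = 3  # any fixed positive integer exponent, picked for illustration

# The substitution x^2 = u gives dx = du / (2*sqrt(u)), so the integrand
# (x^2 + k)^n becomes (u + k)^n / (2*sqrt(u)): linear inside the power,
# but at the price of a new square-root factor.
transformed = (u + k)**n / (2 * sp.sqrt(u))

# Sanity check: substituting u = x^2 back and multiplying by du/dx = 2x
# recovers the original integrand exactly.
recovered = sp.simplify(transformed.subs(u, x**2) * 2 * x)
assert sp.simplify(recovered - (x**2 + k)**n) == 0
```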

I don't think this will get you anywhere. Suppose the integral were ##\int (x^2 + k)^1\,dx##; i.e., with ##n## in your formula set to 1. Replacing ##x^2## by ##x## gets you a very different antiderivative.
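Writing out the two antiderivatives for the ##n = 1## case (my own worked example) makes the difference explicit:

```latex
\int (x^2 + k)\,dx = \frac{x^3}{3} + kx + C ,
\qquad\text{whereas}\qquad
\int (x + k)\,dx = \frac{x^2}{2} + kx + C .
```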

I see the problem with this now. But aren't non-linear functions a system of linear functions on different infinitesimal intervals? When I see the graph of ##x^2##, I always think of zooming into a really small interval (I like to denote that interval ##[a, a+h]##, where ##h## approaches ##0##). Won't it be a line on that interval, even if it is a really small one?
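This "zooming in" intuition can be checked numerically; the sketch below (my own illustration) compares ##x^2## on ##[a, a+h]## with its tangent line at ##a## and shows the worst-case gap shrinking like ##h^2##, so the curve looks straighter and straighter but never becomes a line:

```python
def f(t):
    return t * t

a = 1.0

def tangent(t):
    # Tangent line to f at a, using f'(a) = 2a
    return f(a) + 2 * a * (t - a)

for h in [0.1, 0.05, 0.025]:
    # Sample the interval [a, a + h] finely
    xs = [a + i * h / 1000 for i in range(1001)]
    gap = max(abs(f(t) - tangent(t)) for t in xs)
    # For f(x) = x^2 the worst gap on [a, a+h] is exactly h^2,
    # so halving h quarters the deviation from the line.
    print(f"h = {h}: worst gap = {gap:.6f}")
```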

I think the best you can do in this regard is to approximate _the local change of a function_ by a linear map when the function is differentiable at a point.
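Spelled out, differentiability at ##a## says precisely that such a linear map ##L## exists:

```latex
\lim_{h \to 0} \frac{\lvert f(a+h) - f(a) - L(h) \rvert}{\lvert h \rvert} = 0 ,
\qquad L(h) = f'(a)\, h .
```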

Yes. This property is called being locally Euclidean, and one of its applications is tangent spaces. It's the starting point of differential geometry.

#### TheoEndre
Ooh! Thank you so much for this. I haven't studied differential geometry yet, so I didn't know it had these awesome topics. Thanks to you, I now have something to answer my questions!
And thanks to @WWGD and @Mark44 for their answers; they were really helpful.

#### WWGD
But is it the function itself or the change of the function that is considered linear?

When it comes to differentiation, there are so many different views on the same thing that it's confusing. I once listed some out of curiosity and stopped at ten without even mentioning the word slope. In the end it's always a directional derivative, a measure of change in a certain direction. The process itself (differentiation) is linear, and the change in the sense of slope defines a linear function as an approximation (##x \mapsto f\,'(a)\cdot x##) in the small neighborhood the OP mentioned. Of course the function itself doesn't change. And he already instinctively mentioned the limits of such an approach:
won't it be a line at that interval? even if it was a really small one.
i.e. the smaller the interval, the more accurate the approximation will be, which means they are never equal (except for linear functions); instead we have the famous ##f(x) = f\,(a) + f\,'(a)\cdot (x-a) + o(x-a)##. Now we can discuss the remainder. Our prof tortured us with remainder estimations of the Taylor series in all variants (real one-dimensional, complex, real multi-dimensional), and of course I've forgotten all of them.
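For ##f(x) = x^2## the remainder can even be written down exactly, which makes a nice sanity check of the formula:

```latex
x^2 = a^2 + 2a\,(x - a) + (x - a)^2 ,
```

so here the ##o(x-a)## term is exactly ##(x-a)^2##, matching the ##h^2## deviation from the tangent line discussed above.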

I also forgot to add that this local condition doesn't make integration any easier. In the end we would find ourselves confronted with Riemann sums again ... until someone shows up and says: "Lebesgue - forget Riemann".
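A small sketch of that last point (my own illustration): integrating the piecewise-linear approximation of ##x^2## over ##[0, 1]## is exactly the trapezoid rule, i.e. a Riemann-type sum that still needs a limit to reach the true value ##1/3##:

```python
def trapezoid(f, a, b, n):
    """Integrate f over [a, b] with n piecewise-linear (trapezoidal) pieces."""
    h = (b - a) / n
    interior = sum(f(a + i * h) for i in range(1, n))
    return ((f(a) + f(b)) / 2 + interior) * h

exact = 1 / 3  # integral of x^2 over [0, 1]
for n in [10, 100, 1000]:
    approx = trapezoid(lambda x: x * x, 0.0, 1.0, n)
    # For a quadratic the trapezoid error is exactly 1/(6 n^2):
    # the error shrinks like 1/n^2 but never vanishes for finite n.
    print(n, approx, abs(approx - exact))
```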

I see; we can then describe the value of the function nearby thanks to the approximation given by the differential. Yes, and I agree about the confusion regarding all the different definitions.

Ah, yes, I fell into this confusion myself: it is the function as approximated, within the tangent plane. I always fall for it.