Boorglar
Solving linear differential equations by "factoring"
I have thought of an interesting way of solving n-th order linear differential equations (with constant coefficients) by imitating the way we solve n-th degree polynomials, that is, by "factoring" the equation into a "product" of simple 1st-order linear differential equations. Does this method already exist? I haven't been able to prove it rigorously, but the cases n = 2 and 3 convinced me that it is reasonable to believe it's true.
We have the n-th order linear diff. eq.:
a_{n}y^{(n)}(x) + a_{n-1}y^{(n-1)}(x) + ... + a_{0}y(x) = f(x)
or in terms of the differential operator: (a_nD^n+a_{n-1}D^{n-1}+ ... + a_0)[y(x)] = f(x)
My claim is that any such equation can be "factored" into
a_n(D-\lambda_n)(D-\lambda_{n-1})...(D-\lambda_1)[y(x)] = f(x)
where D is the differential operator and \lambda_{i} is the i-th root, counted with multiplicity, of the characteristic equation a_{n}r^{n}+a_{n-1}r^{n-1}+ ... + a_{0} = 0,
and the "multiplication" is actually the successive composition of the D-\lambda_i operations, which take the derivative of the argument and then substract \lambda_i times the argument.
Then, naming the function produced by the i-th operation \phi_i(x) (so \phi_1 = (D-\lambda_1)[y] and \phi_i = (D-\lambda_i)[\phi_{i-1}]), the outermost factor gives the equation a_n\phi_n(x) = f(x), that is, a_n\phi_{n-1}'(x) - a_n\lambda_n\phi_{n-1}(x) = f(x). This is a simple 1st-order linear diff. eq. solvable with an integrating factor, and after solving it we repeat the process until we reach the last equation, for y itself: each time we substitute the solved \phi_i into the next equation, for \phi_{i-1}, and solve another 1st-order equation.

Can the "factoring" part be proven rigorously, in a way similar to how it is proven for polynomials with the Fundamental Theorem of Algebra? I know I haven't given any proof of this, but it seems to be true. Is this method already known? Is it any better or worse than the traditional way such equations are solved?
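To make the procedure concrete, here is a rough sketch of it in Python with sympy (solve_by_factoring and the coefficient convention coeffs = [a_0, ..., a_n] are just illustrative choices, not a polished implementation):

```python
import sympy as sp

x = sp.symbols('x')

def solve_by_factoring(coeffs, f):
    """Solve a_n*y^(n) + ... + a_0*y = f(x) by 'factoring' the operator.

    coeffs -- [a_0, a_1, ..., a_n], the constant coefficients.
    Returns the general solution, with constants C1, C2, ...
    """
    # Roots of the characteristic polynomial a_n*r^n + ... + a_0 = 0,
    # listed with multiplicity.
    r = sp.symbols('r')
    char_poly = sum(a * r**i for i, a in enumerate(coeffs))
    lambdas = sp.roots(char_poly, r, multiple=True)

    # Outermost equation: a_n * phi_n = f.
    phi = f / coeffs[-1]

    # Peel off one factor (D - lambda) per pass. Since the coefficients
    # are constant, the factors commute and the order does not matter.
    # Each pass solves phi_prev' - lambda*phi_prev = phi via the
    # integrating factor exp(-lambda*x).
    constants = sp.numbered_symbols('C', start=1)
    for lam in lambdas:
        mu = sp.exp(-lam * x)  # integrating factor
        phi = (sp.integrate(mu * phi, x) + next(constants)) / mu

    return sp.expand(phi)

# Example: y'' - 3y' + 2y = e^(3x); the characteristic roots are 1 and 2.
# The homogeneous part is spanned by exp(x) and exp(2*x), and a
# particular solution is exp(3*x)/2.
print(solve_by_factoring([2, -3, 1], sp.exp(3 * x)))
```

Note that repeated roots need no special treatment here: the same integrating-factor step simply runs once per copy of the root, and the second pass through an equal \lambda is what produces the familiar x e^{\lambda x} term.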