Linear Algebra homework problem

Sampson12
The set F of all functions from R to R is a vector space with the usual operations of addition of functions and scalar multiplication. Is the set of solutions to the differential equation ##f''(x) + 3f'(x) + x^2 f(x) = 0## a subspace of F? Justify your answer.


I know that to prove that the set of solutions is a subspace of F I need to show that the set is not empty, is closed under addition, and is closed under scalar multiplication. The only problem I have is solving the differential equation, which I am not sure how to do: solving this kind of differential equation (I only know how to solve second-order DEs with constant coefficients) has not been brought up in the current course (Linear Mathematics, 2nd year maths) or any of the prerequisite courses. Do I actually need to solve the equation to find the answer, or is there another way to find out whether it's a subspace of F or not? Any help would be very much appreciated.
 
The problem does NOT ask you to solve the equation. Yes, you must show:

1) The set is not empty. Equivalently, show that it contains the "0" vector. Is f(x) = 0 a solution of this equation?

2) The set is closed under addition. If f and g are solutions, is f + g a solution? Just put f + g into the equation and try to separate f and g.

3) The set is closed under scalar multiplication. If f is a solution and a is a number, is af a solution? Just put af into the equation and try to factor out a.
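For step 2, the substitution works out by the linearity of differentiation:

##(f+g)'' + 3(f+g)' + x^2(f+g) = \big(f'' + 3f' + x^2 f\big) + \big(g'' + 3g' + x^2 g\big) = 0 + 0 = 0.##

Step 3 is the same computation with af in place of f + g: the scalar a factors out of every term, leaving ##a\big(f'' + 3f' + x^2 f\big) = a \cdot 0 = 0##.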
 
If f(x) and g(x) satisfy your equation, you should be able to show that f(x) + g(x) also satisfies it; the same goes for a*f(x). Finally, you need to show that a solution exists. There exist theorems which state this, but it's easy to find a particular function which solves this equation (hint: the existence of this solution follows directly from either of the above conditions).
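As a quick numerical sanity check (not a proof, and not part of the homework argument), here is a small Python sketch that approximates the operator L[h] = h'' + 3h' + x^2 h with finite differences and confirms that L is linear, which is exactly what steps 2 and 3 rely on. The test functions and sample points are arbitrary illustrative choices; the functions need not themselves be solutions, since only linearity of L is being checked.

```python
import math

def d1(h, x, eps=1e-4):
    """Central-difference approximation to h'(x)."""
    return (h(x + eps) - h(x - eps)) / (2 * eps)

def d2(h, x, eps=1e-4):
    """Central-difference approximation to h''(x)."""
    return (h(x + eps) - 2 * h(x) + h(x - eps)) / eps**2

def L(h, x):
    """The operator L[h](x) = h''(x) + 3 h'(x) + x^2 h(x)."""
    return d2(h, x) + 3 * d1(h, x) + x**2 * h(x)

# Arbitrary smooth test functions and scalar.
f, g, a = math.sin, math.exp, 2.5

for x in [0.3, 1.0, 2.2]:
    # Additivity: L[f + g] = L[f] + L[g]
    assert abs(L(lambda t: f(t) + g(t), x) - (L(f, x) + L(g, x))) < 1e-5
    # Homogeneity: L[a*f] = a * L[f]
    assert abs(L(lambda t: a * f(t), x) - a * L(f, x)) < 1e-5
```

Because L is linear, L[f + g] = L[f] + L[g] = 0 + 0 = 0 whenever f and g are solutions, which is the closure-under-addition argument in symbols.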
 
Thanks for the help
 