I am in a problem seminar class and I have not taken Linear Algebra in over 4 years, so I am having a lot of trouble with this. Please help...
Homework Statement
Let P be the set of all polynomials with real coefficients and of degree less than 3. Thus,
P = {f : f(x) = a_0 + a_1x + a_2x^2, with each a_i real}
P is a vector space over the field of real numbers under the usual operations of addition and scalar multiplication of polynomials.
Let V = {f in P : f(-2) = f(1)}.
Find a basis for V.
Homework Equations
Not sure
I know that for a set to be a basis, it must be linearly independent and span V.
The Attempt at a Solution
I proved in the first part of this problem that V is a subspace of P.
I also said that in order for f(-2) = f(1), we need
a_0 - 2a_1 + 4a_2 = a_0 + a_1 + a_2,
which gives -3a_1 + 3a_2 = 0, so
a_1 = a_2.
I'm not sure if this is right or not...
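One sanity check I could run (a quick sketch of my own in Python with sympy; the variable names are just mine, not from the book):

```python
# My own check: verify that f(-2) = f(1) reduces to a1 = a2
# for f(x) = a0 + a1*x + a2*x^2.
from sympy import symbols, Eq, solve

a0, a1, a2, x = symbols('a0 a1 a2 x')
f = a0 + a1*x + a2*x**2

constraint = Eq(f.subs(x, -2), f.subs(x, 1))  # a0 - 2a1 + 4a2 = a0 + a1 + a2
print(solve(constraint, a1))  # prints [a2], i.e. a1 = a2
```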
So I picked two different elements of this set and showed that they were linearly independent.
The elements I chose were v_1(x) = 1 + 3x + 3x^2 and v_2(x) = 0 + 1x + 1x^2.
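To check the independence claim (another sketch of my own, treating each polynomial as its coefficient vector (a_0, a_1, a_2)):

```python
# My own check: put the coefficient vectors of v1 and v2 in a matrix
# and look at its rank (rank 2 means they are linearly independent).
from sympy import Matrix

v1 = Matrix([1, 3, 3])  # 1 + 3x + 3x^2
v2 = Matrix([0, 1, 1])  # 0 + 1x + 1x^2
M = Matrix.hstack(v1, v2)
print(M.rank())  # 2, so {v1, v2} is linearly independent
```

(Both also satisfy a_1 = a_2, so at least they really are elements of V.)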
Now I found a theorem in my linear algebra book that says:
"Let H be a subspace of a finite-dimensional vector space V. Any linearly independent set in H can be expanded, if necessary, to a basis for H. Also, H is finite-dimensional and dim H ≤ dim V."
Now I know that dim P = 3.
Thus dim V ≤ 3, and since the two polynomials I used above are linearly independent, they should be able to be expanded to a basis, but my text does not tell me how to do this...
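One idea I had (a sketch of my own; I'm not sure it's the intended method): since my condition says V corresponds to the solutions (a_0, a_1, a_2) of the single equation -3a_1 + 3a_2 = 0, maybe I can get dim V from the null space of that equation:

```python
# My own sketch: the constraint -3a1 + 3a2 = 0 as a 1x3 matrix acting
# on (a0, a1, a2); its null space should correspond to V.
from sympy import Matrix

A = Matrix([[0, -3, 3]])  # coefficients of a0, a1, a2
basis = A.nullspace()
print(len(basis))  # 2, suggesting dim V = 2
print(basis)       # vectors (1,0,0) and (0,1,1), i.e. 1 and x + x^2
```

If that's right, then dim V = 2 and my two independent polynomials above would already be a basis, but I'd like to be sure.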
Am I going about this wrong? I don't really understand how to show that a set spans V, either.
Please point me in the right direction!