Linear algebra vector space question

jimmycricket

Homework Statement


Let ##V=\mathrm{Pol}_3(\mathbb R)## be the vector space of polynomials of degree ##\leq 3## with real coefficients. Let ##U## be the subspace of all polynomials in ##V## of the form ##aX^3+(b-a)X^2+bX+(d-b)##, and let ##W## be the subspace of all polynomials in ##V## of the form ##aX^3+bX^2+cX+d## such that ##a+c-d=0##.

(i) Does ##U+W=\mathrm{Pol}_3(\mathbb R)##?
(ii) Does ##U\cap W= \{0\}##?

Homework Equations





The Attempt at a Solution


(i) Adding ##U## and ##W## I get ##2aX^3+(2b-a)X^2+(b+c)X+(2d-b)##

Extracting the matrix and row reducing gives the 4x4 identity matrix, which has rank 4; hence ##U+W=\mathrm{Pol}_3(\mathbb R)##.
Is this reasoning correct?

(ii) I think I need to find the dimension of ##U\cap W## but don't know how to proceed from here.
Please give as detailed an answer as possible.
 
First, I'm not sure what the definition is of U + W. You seem to be implying that it is the space of sums of polynomials in U and W. Is that correct? We do need a definition.

If so, I don't understand what 4x4 matrix you have extracted. Could you explain? Nor do I see where you have used the conditions on a,b,c,d.

In order to show that these sums generate the whole of ##\mathrm{Pol}_3## you have to show that any polynomial of degree 3 or less can be written as the sum of polynomials in U and W. If you showed that, I don't understand how. Perhaps you could explain in more detail.

Re ##U \cap W##, you have to show which polynomials, if any, besides the zero polynomial are in both U and W. Do you know how to proceed with that?
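One way to sanity-check both questions at once is a small exact computation. This is a hedged sketch using sympy, with the spanning vectors simply read off from the problem's parametrizations (and ##d=a+c## substituted in ##W##); it suggests the answers but is no substitute for the written argument:

```python
# Hedged sanity check (exact rational arithmetic via sympy).
import sympy as sp

# Columns of A span U: in the basis (X^3, X^2, X, 1), the polynomial
# aX^3 + (b-a)X^2 + bX + (d-b) equals a*(1,-1,0,0) + b*(0,1,1,-1) + d*(0,0,0,1).
A = sp.Matrix([[1, 0, 0],
               [-1, 1, 0],
               [0, 1, 0],
               [0, -1, 1]])
# Columns of B span W: substituting d = a + c into aX^3 + bX^2 + cX + d
# gives a*(1,0,0,1) + b*(0,1,0,0) + c*(0,0,1,1).
B = sp.Matrix([[1, 0, 0],
               [0, 1, 0],
               [0, 0, 1],
               [1, 0, 1]])

M = A.row_join(-B)          # columns of [A | -B] span U + W
print(M.rank())             # dim(U + W); 4 would mean U + W = Pol_3(R)
print(len(M.nullspace()))   # dim(U ∩ W), valid since A and B have full column rank
```

Any nullspace vector ##(x, y)## of ##[A\,|\,{-B}]## satisfies ##Ax = By##, i.e. it names one polynomial lying in both subspaces, which is why the nullspace dimension equals ##\dim(U\cap W)## here.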
 
jimmycricket said:

(i) Adding U and W I get ##2aX^3+(2b-a)X^2+(b+c)X+(2d-b)## ... Extracting the matrix and row reducing gives the 4x4 identity matrix which has dimension 4 hence ##U+W=\mathrm{Pol}_3(\mathbb R)##.

You can't add them like that. That assumes 'a' in the first expression is the same as 'a' in the second expression, and I don't think the problem implies that: 'a' is any real number in the first expression and any other real number in the second.

Write them in terms of row vectors. Represent ##X^3## as (1,0,0,0), ##X^2## as (0,1,0,0), ##X## as (0,0,1,0) and 1 as (0,0,0,1). So ##aX^3+bX^2+cX+d=(a,b,c,d)=a(1,0,0,0)+b(0,1,0,0)+c(0,0,1,0)+d(0,0,0,1)##. Now work with combinations of the basis vectors (1,0,0,0), (0,1,0,0), (0,0,1,0), (0,0,0,1) when you argue about span or dimension, and put them into your matrices.
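The row-vector idea above can be sketched numerically. This is a hedged illustration with numpy; the rows are the spanning vectors read off from each parametrization (with ##d = a + c## substituted in ##W##), and the rank of the stacked rows is the dimension of their span:

```python
# Each spanning polynomial becomes a row of coefficients (X^3, X^2, X, 1);
# the rank of the stacked rows is the dimension of the span.
import numpy as np

# Spanning vectors of U, one per free parameter in aX^3+(b-a)X^2+bX+(d-b):
U_rows = np.array([[1, -1, 0, 0],   # coefficient pattern of a
                   [0, 1, 1, -1],   # coefficient pattern of b
                   [0, 0, 0, 1]])   # coefficient pattern of d
# Spanning vectors of W, after substituting d = a + c:
W_rows = np.array([[1, 0, 0, 1],
                   [0, 1, 0, 0],
                   [0, 0, 1, 1]])

rank_sum = np.linalg.matrix_rank(np.vstack([U_rows, W_rows]))
print(rank_sum)   # if this is 4, the six vectors span R^4, i.e. U + W = Pol_3(R)
```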
 
The matrix was taken from adding the two expressions directly and then putting the coefficients of ##a## in each term into the first column, the coefficients of ##b## in each term into the second column, and so on, i.e. ##2aX^3+(2b-a)X^2+(b+c)X+(2d-b)## gives

$$\begin{pmatrix} 2&0&0&0\\ -1&2&0&0\\ 0&1&1&0\\ 0&-1&0&2 \end{pmatrix}$$

which reduces to

$$\begin{pmatrix} 1&0&0&0\\ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&1 \end{pmatrix}$$

whose rows span ##\mathbb R^4## and hence must form a basis for ##\mathrm{Pol}_3(\mathbb R)##.
 
Do read Dick's comments. I don't think what you are doing properly shows that you have a basis. In particular, you cannot assume that the a,b,c,d for the first equation are the same as for the second.

What you want to show is that any set of coefficients, say ##\alpha, \beta, \gamma, \eta##, can be written as a linear combination of the coefficients of the given polynomials.
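Once part (i) is settled, the dimension formula makes part (ii) quick. Assuming you have checked that each subspace has three independent spanning vectors, so ##\dim U=\dim W=3##, and that ##\dim(U+W)=4##:

$$\dim(U\cap W)=\dim U+\dim W-\dim(U+W)=3+3-4=2\neq 0,$$

so the intersection cannot be ##\{0\}##.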
 