Linear Algebra -- Is this a basis?

Summary
The discussion focuses on determining whether two sets of polynomials are bases for P_2(R), the space of polynomials of degree at most 2. For both sets, linear independence and spanning the space are the required criteria. The first set (1+2x+x^2, 3+x^2, x+x^2) was analyzed using row echelon form, leading to confusion about its basis status due to differing interpretations of P_2(R). The second set (-1+2x+4x^2, 3-4x-10x^2, -2-5x-6x^2) was set up differently, highlighting that a column representation is preferable for clarity in linear independence checks. Ultimately, both sets are bases for P_2(R) when it is defined as the polynomials of degree ≤ 2.
RJLiberator

Homework Statement



Determine if the following sets are bases for P_2(R)

b) (1+2x+x^2, 3+x^2, x+x^2)
d) (-1+2x+4x^2, 3-4x-10x^2, -2-5x-6x^2)

Homework Equations


Basis IFF linearly independent AND span(set) = P_2(R)
RREF = Reduced Row Echelon Form

The Attempt at a Solution



My first question here regards an understanding of notation.

So for both b) and d) I worked it out into row echelon form and found linear independence. Assuming I did the calculations correctly, is it safe to say that these are bases for P_2(R), since the number of terms in each set is 3 and, as 3 - 1 = 2, P_2 is safe and these sets span P_2(R)?

Second question: For b) I was able to get the RREF rather easily from the matrix:
\begin{pmatrix}
1 & 2 & 1 & 0\\
1 & 0 & 3 & 0\\
1 & 1 & 0 & 0
\end{pmatrix}

This should be linearly independent in RREF, and so it is a bases. An answer key says "no," this is not a bases. However, the answer key can be wrong. Is there anything I did wrong here in setting up this problem for b), or perhaps in my understanding of span?

Third question: For d) I was checking a solution, and the solution had set up the matrix differently than what I had expected. They set up the matrix as such:
\begin{pmatrix}
-1 & 3 & -2 & 0\\
2 & -4 & -5 & 0\\
4 & -10 & -6 & 0
\end{pmatrix}
As you can see, since d) is (-1+2x+4x^2, 3-4x-10x^2, -2-5x-6x^2), they switched the columns and rows from what I'm traditionally used to. Is this OK to do? The normal way I do it, like my example in b), is difficult to get into RREF; however, this way is rather easy.
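For anyone who wants to sanity-check both row reductions by machine, here is a minimal Python sketch using sympy (the library and the column-per-polynomial layout are my choices, not part of the assignment):

```python
from sympy import Matrix

# One polynomial per column; rows hold the coefficients of 1, x, x^2.
B = Matrix([[1, 3, 0],    # constant terms of 1+2x+x^2, 3+x^2, x+x^2
            [2, 0, 1],    # x coefficients
            [1, 1, 1]])   # x^2 coefficients
D = Matrix([[-1, 3, -2],  # same layout for -1+2x+4x^2, 3-4x-10x^2, -2-5x-6x^2
            [2, -4, -5],
            [4, -10, -6]])

for name, M in (("b", B), ("d", D)):
    _, pivots = M.rref()  # rref() returns (reduced matrix, pivot columns)
    # Three pivots = linearly independent = a basis for a 3-dimensional space.
    print(name, "is a basis" if len(pivots) == 3 else "is not a basis")
```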
 
In response to your final question: "Is this a basis?" is correct. Basis - singular, bases - plural.
Also, the word is "grammar."
RJLiberator said:
So for both b) and d) I worked it out into row echelon form and found linear independence. Assuming I did the calculations correctly, is it safe to say that these are bases for P_2(R), since the number of terms in each set is 3 and, as 3 - 1 = 2, P_2 is safe and these sets span P_2(R)?
It's not clear what you're asking. From the context of the problem, I take it that P2 is the space of polynomials of degree ##\le## 2. Some textbooks define this as polynomials of degree < 2.
For a set of functions to be a basis for some space, the set (1) has to be linearly independent and (2) has to span the space. You haven't said what P2 means in your book, so I can't say whether the sets are linearly independent.
RJLiberator said:
Second question: For b) I was able to get the RREF rather easily from the matrix:
\begin{pmatrix}
1 & 2 & 1 & 0\\
1 & 0 & 3 & 0\\
1 & 1 & 0 & 0
\end{pmatrix}
This isn't the way I would do it, for two reasons.
1) Each function in the set, treated as a vector, should appear as a column in the matrix, not a row. For what you did, it doesn't make much difference, as you are essentially working with a 3 x 3 matrix. If this matrix is row-reducible to the identity matrix, so will be its transpose.
2) There is no reason to have that fourth column of 0s. None of the row operations will cause it to change.
RJLiberator said:
This should be linearly independent in RREF, and so it is a bases.
It is a basis.
RJLiberator said:
An answer key says "no," this is not a bases. However, the answer key can be wrong. Is there anything I did wrong here in setting up this problem for b), or perhaps in my understanding of span?
Again, how is P2 defined? If it's the space of polynomials of degree < 2, then you have too many functions in your set for the set to be linearly independent.
RJLiberator said:
Third question: For d) I was checking a solution and the solution had set up the matrix differently then what I had expected. They set up the matrix as such:
\begin{pmatrix}
-1 & 3 & -2 & 0\\
2 & -4 & -5 & 0\\
4 & -10 & -6 & 0
\end{pmatrix}
This is how I would set it up, with the coefficients of the polynomials as columns, and with the elements in a column being the coefficients of 1, x, ##x^2##, in that order. I would not have the fourth column, though.
RJLiberator said:
As you can see, since d) (-1+2x+4x^2, 3-4x-10x^2,-2-5x-6x^2), they switched the columns and rows that I'm traditionally used to. Is this ok to do?
Not only is the way they did it OK -- it's better. When you set up an equation for determining whether a set of vectors/functions is linearly independent, the equation looks like
$$c_1\vec{v_1} + c_2\vec{v_2} + \dots + c_n\vec{v_n} = \vec{0}$$
The vectors in this equation are column vectors. Expanding the above as a matrix product, you get the following
$$\begin{bmatrix} v_{11} & v_{21} & \dots & v_{n1} \\
v_{12} & v_{22} & \dots & v_{n2} \\
\vdots & \vdots & \ddots & \vdots \\
v_{1m} & v_{2m} & \dots & v_{nm} \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix} = \vec{0}$$

The first column contains the coordinates of the first vector/polynomial, the second column those of the second, and so on.
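For the concrete set in d), the equation above becomes a small homogeneous system. Here is a minimal sympy sketch (the tool is an assumption; any CAS would do):

```python
from sympy import Matrix, linsolve, symbols

c1, c2, c3 = symbols("c1 c2 c3")
# Coordinates of the polynomials in d) as columns (coefficients of 1, x, x^2).
M = Matrix([[-1, 3, -2],
            [2, -4, -5],
            [4, -10, -6]])

# Solve M * (c1, c2, c3)^T = 0; linsolve accepts an (A, b) pair.
print(linsolve((M, Matrix([0, 0, 0])), c1, c2, c3))  # {(0, 0, 0)}
```

Since the only solution is ##c_1 = c_2 = c_3 = 0##, the three polynomials are linearly independent.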
RJLiberator said:
The normal way I do it, like my example in b), is difficult to get into RREF; however, this way is rather easy.

EDIT: I know my title is improper grammer. Is this a bases* should be the title.
 
*bow* *clap*
You may have just solved every mystery I had with these two problems for the past few days.

To summarize.
1) Grammar* :p. It is bases for plural, basis for singular.
2) I need to be able to define P_2 (R). This should mean the space of polynomials of degree ≤2.
3) Each function in a set is treated like a vector. This means that it should be represented as a column. While what I did in part b worked out, it isn't exactly the 'correct' way of operating with matrices and I should fix this understanding before I move forward.
4) Eliminate the 0 column, saves time.

So, based on my definition of P_2(R) being degree ≤ 2, I can say that both b and d are bases for P_2(R).
If the definition were actually < 2, then neither would be a basis for P_2(R).
 
RJLiberator said:
2) I need to be able to define P_2 (R). This should mean the space of polynomials of degree ≤2.
You don't need to define it -- it should be defined in your book. That said, since the polynomials include quadratics, I'm guessing that your book defines P2 as the space of polynomials of degree ≤ 2. Since the three functions are linearly independent, and three is the dimension of that space (the minimum number for a spanning set), the set must be a basis for P2.
 
Thank you for all of your clarification here.
 
Mark44 said:
This isn't the way I would do it, for two reasons.
1) Each function in the set, treated as a vector, should appear as a column in the matrix, not a row. For what you did, it doesn't make much difference, as you are essentially working with a 3 x 3 matrix. If this matrix is row-reducible to the identity matrix, so will be its transpose.
2) There is no reason to have that fourth column of 0s. None of the row operations will cause it to change.

There is nothing wrong with having a row for the coefficients of a polynomial, with separate rows for different polynomials. In fact, I have seen that done in some books, and it is the way I myself would do it. That would, for example, make row operations on the matrix (to get new rows) correspond to taking linear combinations of the polynomials (to get new polynomials). It depends on whether you prefer row or column operations, but should make absolutely no theoretical or practical difference.

As to that pesky fourth column: I see it as a hindrance rather than a help; for example, it prevents taking a determinant, which is one quick way of checking linear independence.
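For instance, here is that determinant check as a short sympy sketch (the tool choice is an assumption, not from the thread):

```python
from sympy import Matrix

B = Matrix([[1, 3, 0], [2, 0, 1], [1, 1, 1]])         # set b), one polynomial per column
D = Matrix([[-1, 3, -2], [2, -4, -5], [4, -10, -6]])  # set d)

# Nonzero determinant <=> columns linearly independent <=> the set is a basis.
print(B.det(), D.det())  # -4 10
```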
 
Personally, I don't like changing to matrix form for problems like this. That smacks too much of "memorizing formulas" rather than actually understanding what you are doing. A set of vectors is a basis for a vector space if and only if every vector in the space can be written, in a unique way, as a linear combination of the vectors. Here, the vector space is the set of all polynomials of degree at most two with real coefficients, so any "vector" can be written in the form ##ax^2+ bx+ c## for real numbers ##a##, ##b##, and ##c##, with the usual addition of polynomials and multiplication by real numbers as the operations. So the question becomes: can we find real numbers ##\alpha##, ##\beta##, and ##\gamma## such that ##\alpha(1+ 2x+ x^2)+ \beta(3+ x^2)+ \gamma(x+ x^2)= ax^2+ bx+ c## for any ##a, b, c## (does the set span the space) and, if so, is that solution unique (is the set linearly independent)?

Multiplying the left side out, ##\alpha+ 2\alpha x+ \alpha x^2+ 3\beta+ \beta x^2+ \gamma x+ \gamma x^2= ax^2+ bx+ c##.
Combining like terms, ##(\alpha + \beta+ \gamma)x^2+ (2\alpha+ \gamma)x+ (\alpha+ 3\beta)= ax^2+ bx+ c##.

Since those have to be equal for all ##x##, corresponding coefficients must be equal:
##\alpha+ \beta+ \gamma= a##,
##2\alpha+ \gamma= b##, and
##\alpha+ 3\beta= c##.
(That corresponding coefficients must be equal follows by, for example, letting ##x= 0## so the constant terms must be equal, canceling those terms, dividing both sides by ##x##, then letting ##x= 0## again to show that the coefficients of ##x## must be equal, etc. This also uses the fact that polynomials are continuous functions, since "dividing by ##x## and then letting ##x= 0##" really involves a limit. This is equivalent to ##1, x, x^2## being a basis for this space.)

From ##\alpha+ \gamma= b## we get ##\gamma= b- \alpha##. From ##\alpha+ 3\beta= c## we get ##\beta= (c- \alpha)/3##. Putting those into ##\alpha+ \beta+ \gamma= a##, we have ##\alpha+ (c- \alpha)/3+ (b- \alpha)= (-1/3)\alpha+ c/3+ b= a##, so that ##\alpha= -3(a- b- c/3)##. That is a unique value for all ##a, b, c##. Now go back to ##\beta= (c- \alpha)/3## and ##\gamma= b- \alpha## to determine the unique values of ##\beta## and ##\gamma##.
 
Interesting perspective here.

I think you might have made one mistake.

From α+γ=b we get γ=b−α.

Shouldn't this be: From 2α+γ=b we get γ=b−2α?

so then:
α+(c−α)/3+(b−2α) = a,
which gives (−4/3)α + c/3 + b = a, so α = (3b + c − 3a)/4.
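To double-check the corrected algebra, the full system can be solved symbolically. A minimal sympy sketch (the tool choice is mine, not from the thread):

```python
from sympy import Eq, solve, symbols

alpha, beta, gamma, a, b, c = symbols("alpha beta gamma a b c")
sol = solve([Eq(alpha + beta + gamma, a),  # x^2 coefficients match
             Eq(2*alpha + gamma, b),       # x coefficients match
             Eq(alpha + 3*beta, c)],       # constant terms match
            [alpha, beta, gamma])
print(sol)
# {alpha: -3*a/4 + 3*b/4 + c/4, beta: a/4 - b/4 + c/4, gamma: 3*a/2 - b/2 - c/2}
```

A unique solution for every (a, b, c) confirms that the set both spans P_2(R) and is linearly independent.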
 
HallsOfIvy said:
Personally, I don't like changing to matrix form for problems like this. That smacks too much of "memorizing formulas" rather than actually understanding what you are doing.
That's a good point, and is the reason I mentioned setting up an equation with a linear combination of vectors in a previous post. Understanding why you are row-reducing a matrix is very important.
HallsOfIvy said:
A set of vectors is a basis for a vector space if and only if every vector in the space can be written, in a unique way, as a linear combination of the vectors. Here, the vector space is the set of all polynomials of degree at most two with real coefficients, so any "vector" can be written in the form ##ax^2+ bx+ c## for real numbers ##a##, ##b##, and ##c##, with the usual addition of polynomials and multiplication by real numbers as the operations. So the question becomes: can we find real numbers ##\alpha##, ##\beta##, and ##\gamma## such that ##\alpha(1+ 2x+ x^2)+ \beta(3+ x^2)+ \gamma(x+ x^2)= ax^2+ bx+ c## for any ##a, b, c## (does the set span the space) and, if so, is that solution unique (is the set linearly independent)?
I don't have any problems with someone working with vectors in ##\mathbb{R}^3## (in this case) instead of quadratic functions, provided that they understand that these spaces are isomorphic.
 
  • Like
Likes RJLiberator
RJLiberator said:
Interesting perspective here.

I think you might have made one mistake.
Shouldn't this be: From 2α+γ=b we get γ=b−2α?

so then:
α+(c−α)/3+(b−2α) = a,
which gives (−4/3)α + c/3 + b = a, so α = (3b + c − 3a)/4.
It was bad enough when I couldn't read my own handwriting -- now I can't read my own typing!
 
Edit: Ugh, never mind, I looked at it wrong.

Not sure how deep of a proof you need to do on the said matter.
The system of vectors forms a basis if it's linearly independent and spans ##P_2##.

If you show that ##k_1b_1 + k_2b_2 + k_3b_3 = 0## holds only for the trivial linear combination (all the ##k##-s equal to 0 ##\Leftrightarrow## trivial linear combination), then the vectors are linearly independent. If we form a 3x3 matrix from these vectors, then the matrix is regular and can therefore be transformed into an identity matrix via elementary row operations.

The columns of any regular 3x3 matrix span the space.
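Sketched in sympy (an assumed tool; any CAS works), the regularity check for set d) is:

```python
from sympy import Matrix, eye

M = Matrix([[-1, 3, -2], [2, -4, -5], [4, -10, -6]])  # set d), one vector per column

# A regular matrix row-reduces to the identity, i.e. has rank 3.
rref, _ = M.rref()
print(rref == eye(3), M.rank())  # True 3 -> regular, hence a basis
```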
 
