Bases of functions and matrices with respect to the bases


Discussion Overview

The discussion revolves around the concepts of vector spaces, specifically the polynomial vector space $P_2$, and the determination of bases for this space. Participants explore the definitions of bases, the process of proving that certain sets of polynomials are bases, and the computation of the matrix representation of the differentiation operator with respect to different bases.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant seeks assistance in proving that the sets $\beta$ and $\beta'$ are bases for the vector space $P_2$.
  • Another participant suggests that the original poster should demonstrate understanding of the definition of a basis before seeking help.
  • A participant explains that a basis must allow for unique representation of vectors in the space and discusses the conditions necessary for $\beta'$ to be a basis.
  • There is a discussion about the necessity of both conditions for a set to be considered a basis, including the ability to express every vector as a linear combination and the uniqueness of that representation.
  • Participants provide insights into how to compute the matrix of the differentiation operator with respect to the bases, including specific examples of differentiating basis vectors.
  • One participant expresses frustration at being assumed to have not consulted textbooks and seeks recommendations for resources to aid their understanding.

Areas of Agreement / Disagreement

Participants express differing views on the sufficiency of the conditions for a set to be a basis. While some agree on the necessity of both conditions, others emphasize the importance of demonstrating unique representation. The discussion remains unresolved regarding the specific proofs needed to establish the bases.

Contextual Notes

There are limitations in the discussion regarding the assumptions made about the participants' prior knowledge and the depth of explanations provided. The mathematical steps involved in proving the bases and computing the differentiation matrix are not fully resolved.

Who May Find This Useful

This discussion may be useful for students or individuals seeking to understand the concepts of vector spaces, bases, and linear transformations, particularly in the context of polynomial functions and differentiation.

Kronos1
Hi All struggling with concepts involved here

So I have $${P}_{2} = \left\{ a{t}^{2}+bt+c \mid a,b,c \in \mathbb{R}\right\}$$, which is a real vector space with respect to the usual addition of polynomials and multiplication of a polynomial by a constant.

I need to show that both $$\beta=\left\{1,t,{t}^{2}\right\}$$ and $$\beta^{\prime}=\left\{t,{t}^{2}+t,{t}^{2}+t+1\right\}$$ are bases for $${P}_{2}$$.

Then a real polynomial $$p(t)$$ defines the differentiable function
$$p:\mathbb{R}\to \mathbb{R}, \quad x \mapsto p(x)$$
As shown in elementary calculus, differentiation is the linear transformation
$$D:{P}_{2} \to {P}_{2}, \quad p \mapsto p^{\prime}=\frac{dp}{dx}$$
Find the Matrix of $D$ with respect to the bases

(i) $\beta$ in both the domain and co-domain
(ii) $\beta$ in the domain and $\beta^{\prime}$ in the co-domain
(iii) $\beta^{\prime}$ in the domain and $\beta$ in the co-domain
(iv) $\beta^{\prime}$ in both the domain and co-domain

Any help would be appreciated, as well as a detailed explanation as to why. Thanks in advance.
 
You are asking questions that are explained in any textbook of linear algebra. Yes, it is possible to duplicate a textbook and write a detailed explanation here, but what would justify this effort? Can you convince us that you have no access to textbooks?

See rule #11 http://mathhelpboards.com/rules/ (click on the "Expand" button on top). With respect to proving that the given sets of polynomials are bases, I suggest you show your effort by demonstrating that you know what a basis is. Can you prove that the given sets satisfy at least a part of that definition?

With respect to the matrix of the differentiation operator, its columns are the coordinates of the images of the basis vectors (see, e.g., the Wikipedia article on transformation matrices). In (i), you need to take each vector from $\beta$, differentiate it and find the coordinates of the result in the same basis $\beta$. For example, $D(t^2)=2t$, so the coordinates of $Dt^2$ with respect to $\beta$ are $(0,2,0)$. This is the last column (since $t^2$ is the last basis vector) of the matrix of $D$ with respect to $\beta$ and $\beta$.
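The column-by-column recipe for (i) can be sketched numerically. The following is a minimal numpy illustration (an assumption of this writeup, not code from the thread), storing $at^2 + bt + c$ as the coordinate vector $(c, b, a)$ relative to $\beta$:

```python
import numpy as np

# A polynomial a*t^2 + b*t + c is stored as its beta-coordinates (c, b, a),
# where beta = [1, t, t^2].
def differentiate(coords):
    """Return the beta-coordinates of the derivative of the polynomial."""
    c, b, a = coords
    # d/dt (a t^2 + b t + c) = 2a t + b, i.e. coordinates (b, 2a, 0)
    return np.array([b, 2 * a, 0])

# The columns of the matrix of D are the beta-coordinates of the images
# of the basis vectors 1, t, t^2.
basis = [np.array([1, 0, 0]),   # 1
         np.array([0, 1, 0]),   # t
         np.array([0, 0, 1])]   # t^2
D_beta = np.column_stack([differentiate(v) for v in basis])
print(D_beta)
# [[0 1 0]
#  [0 0 2]
#  [0 0 0]]
```

The last column $(0,2,0)$ is exactly the coordinate vector of $D(t^2)=2t$ described above.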
 
You assume that I have not tried a textbook already. Coming to this forum is a last resort, as I am struggling to understand the definitions in the textbooks.

For all the reading I have done, I have only ascertained that a basis is a set of vectors from which you can build every other vector in the space by adding scalar multiples of the basis vectors. I cannot quite understand how to go about proving that a given set is a basis.
 
Kronos said:
I have only ascertained that a basis is a set of vectors from which you can build every other vector in the space by adding scalar multiples of the basis vectors. I cannot quite understand how to go about proving that a given set is a basis.
Hmm, I think it should be obvious that every element $at^2+bt+c$ of $P_2$ can be obtained from $1$, $t$ and $t^2$ using addition and multiplication by scalars. As for the second basis, suppose that $a$, $b$ and $c$ are fixed and we want to express $at^2+bt+c$ through elements of $\beta'$. We must have
\[
xt+y(t^2+t)+z(t^2+t+1)=at^2+bt+c
\]
for some coefficients $x$, $y$ and $z$. Equating coefficients of the same powers of $t$, we get the following system of equations.
\[
\left\{
\begin{array}{rcl}
z&=&c\\
x+y+z&=&b\\
y+z&=&a
\end{array}
\right.
\qquad(*)
\]
Moving the first equation down converts it into echelon form, so you can determine if it has solutions.

It is important that the ability to express every vector as a linear combination of vectors from $\beta'$ does not make $\beta'$ a basis: it's only one of the two conditions on a basis. The condition you named says that $\beta'$ has sufficiently many vectors to express all vectors in the space. The second condition says that it does not have too many vectors so that expressing other vectors through $\beta'$ becomes ambiguous. Every vector must be expressed in a unique way. It is sufficient to show this only for the zero vector. We know that $0\cdot t^2+0\cdot t+0=0\cdot t+0(t^2+t)+0(t^2+t+1)$. It remains to show that no other linear combination gives the zero vector. You can determine if this is so by looking at the system (*) above where $a=b=c=0$ and figuring out how many solutions it has.
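Because the coefficient matrix of (*) is invertible, every right-hand side $(c, b, a)$ yields exactly one solution, which settles spanning and uniqueness at the same time. A minimal numpy sketch (assumed for illustration, not from the thread):

```python
import numpy as np

# System (*) in the unknowns (x, y, z):
#   z = c,  x + y + z = b,  y + z = a
A = np.array([[0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])

# Nonzero determinant: a unique solution exists for EVERY (c, b, a),
# so beta' spans P_2 and representations are unique (in particular,
# only the all-zero combination gives the zero vector).
assert abs(np.linalg.det(A)) > 1e-12

# Example: express t^2 + 2t + 3, i.e. (a, b, c) = (1, 2, 3).
a, b, c = 1.0, 2.0, 3.0
x, y, z = np.linalg.solve(A, np.array([c, b, a]))
# x*t + y*(t^2 + t) + z*(t^2 + t + 1) reproduces t^2 + 2t + 3
```

Here $x = 1$, $y = -2$, $z = 3$: indeed $t - 2(t^2+t) + 3(t^2+t+1) = t^2 + 2t + 3$.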
 
Evgeny.Makarov said:
It is important that the ability to express every vector as a linear combination of vectors from $\beta'$ does not make $\beta'$ a basis: it's only one of the two conditions on a basis.

Actually, since we have a finite-dimensional vector space and since $|\beta'| = |\beta| = \dim P_2 = 3$, spanning alone is sufficient, because

$\dim(\text{span}(\beta')) \leq |\beta'|$.

The condition you state, "the ability to express every vector as a linear combination of vectors from $\beta'$", is equivalent to saying $\text{span}(\beta') = P_2$, which has dimension 3, so "less than" is not an option: a spanning set whose size equals the dimension is automatically linearly independent.
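This dimension count can be phrased as a rank computation: put the $\beta$-coordinates of the three vectors of $\beta'$ into the columns of a matrix; rank 3 means $\text{span}(\beta') = P_2$, and with $|\beta'| = 3 = \dim P_2$ that already makes $\beta'$ a basis. A minimal numpy sketch (illustrative, not from the thread):

```python
import numpy as np

# beta'-vectors t, t^2+t, t^2+t+1 written in beta-coordinates (1, t, t^2),
# one vector per column.
B = np.array([[0, 0, 1],   # constant terms
              [1, 1, 1],   # coefficients of t
              [0, 1, 1]])  # coefficients of t^2

# Full rank (3) equals dim P_2, so beta' spans P_2; having exactly
# three vectors, it is therefore a basis.
print(np.linalg.matrix_rank(B))  # 3
```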
 
Evgeny.Makarov said:
You are asking questions that are explained in any textbook of linear algebra. Yes, it is possible to duplicate a textbook and write a detailed explanation here, but what would justify this effort? Can you convince us that you have no access to textbooks?
Could you recommend one of these textbooks - I have 3 at the moment and cannot find direction from them.
 
