General Questions on Polynomials, Vectors, and Matrices

  • Context: Graduate
  • Thread starter: mrxtothaz
  • Tags: General
SUMMARY

This discussion focuses on the application of polynomials, vectors, and matrices in linear algebra. The basis of the 3-dimensional space is defined by the polynomials {x² + 3x - 2, 2x² + 5x - 3, -x² - 4x + 4}, and the participants explore methods to demonstrate that these vectors span the space using matrix representation. Key concepts include the geometric interpretation of vector equations, the conditions for invertibility in transformations, and the implications of row reduction for determining rank and nullity. The consensus is that vector spaces can only be isomorphic when the transformation matrix is square.

PREREQUISITES
  • Understanding of polynomial functions and their representation as vectors
  • Familiarity with matrix operations and row reduction techniques
  • Knowledge of linear transformations and their properties
  • Concept of vector spaces and isomorphisms
NEXT STEPS
  • Study the process of proving vector span using matrix representation
  • Learn about the geometric interpretation of vector equations in 3D space
  • Explore the conditions for invertibility of transformation matrices
  • Investigate the fundamental theorem of linear algebra regarding rank and nullity
USEFUL FOR

Students and professionals in mathematics, particularly those studying linear algebra, vector spaces, and matrix theory. This discussion is beneficial for anyone looking to deepen their understanding of polynomial representations and linear transformations.

mrxtothaz:
I'm doing a bit of review and have a few brief questions.

1) Say you have 3 polynomials that generate a 3-dimensional space. Let this basis be {x² + 3x - 2, 2x² + 5x - 3, -x² - 4x + 4}. To prove that these vectors span the space, I want to show that any vector in the space can be expressed in terms of these basis elements. I did this at the beginning of the year, before learning matrices; I have since attempted to do it with matrices and for some reason I can't figure it out.

My approach has basically been to put the coefficients of the degree-2 terms in a row and solve for a, and then to do the same for the terms of each other degree (in terms of b and c). What am I doing wrong?

2) To find the equation of a line passing through two points, say (3, -2, 4) and (-5, 7, 1), why is it done by subtracting them? The equation ends up being v = (3, -2, 4) + t·d, where I presume the direction is d = (-5, 7, 1) - (3, -2, 4) = (-8, 9, -3) and t is some parameter. But I don't understand the geometric reason behind this.

3) Is there such a thing as an invertible m×n transformation, or does this concept apply only to square matrices? In my reading, I have come across the definition that an isomorphism between vector spaces is an invertible map between them; that is, there are maps S and T such that ST = I and TS = I. Is this only possible for square matrices?

4) Before being introduced to linear transformations represented by matrices, I used to think of vectors as row vectors; then I learned it was the other way around. Something I have recently read has brought a new question to mind: the correspondence of a vector x with [x]_B (the coordinate vector of x relative to a basis B) provides a linear transformation from V to F^n. Am I correct to interpret this as saying that you can consider the elements of some vector space as row vectors, but once you express them with respect to a basis (which is itself a transformation), they become column vectors?

5) By using row reduction, you can find the rank and nullity of a given transformation. I have read that you can tell the rank by looking at either the columns or the rows (the maximum number of linearly independent rows or columns is the same). Do both of these approaches work with row reduction?

I would appreciate any help. Thanks in advance.
 
  1. Write the polynomials in your basis as coefficient vectors, so you have {u, v, w} (for example u = (1, 3, -2)). Now put them as the columns of a matrix [u v w] and show this matrix is invertible (row-reducible to the identity matrix).
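A quick numeric sketch of point 1, using NumPy (an assumption; any hand row reduction works just as well). Each polynomial ax² + bx + c becomes the coefficient vector (a, b, c), the three vectors become the columns of a matrix, and a nonzero determinant shows they span R³:

```python
import numpy as np

u = np.array([1, 3, -2])    # x^2 + 3x - 2
v = np.array([2, 5, -3])    # 2x^2 + 5x - 3
w = np.array([-1, -4, 4])   # -x^2 - 4x + 4

M = np.column_stack([u, v, w])
det = np.linalg.det(M)      # nonzero (it is -1), so the columns span R^3

# Any polynomial, e.g. the arbitrary example p(x) = 4x^2 + x + 7,
# is then a unique combination a*u + b*v + c*w:
p = np.array([4, 1, 7])
a, b, c = np.linalg.solve(M, p)
print(det, a, b, c)
```

Solving the system here is exactly the "express any vector in the basis" step the question asks about, done all at once instead of degree by degree.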
  2. Instead of a 3-dimensional example you might want to look at a 2-dimensional example. Drawing them could help too.
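For point 2, a small check of the parametric form may help: the subtraction produces the direction vector d = Q - P, and t is a scalar that slides you along that direction starting from P. At t = 0 you sit at P, at t = 1 you have travelled the full displacement and arrive at Q:

```python
import numpy as np

P = np.array([3, -2, 4])
Q = np.array([-5, 7, 1])
d = Q - P                 # direction vector (-8, 9, -3)

def line(t):
    # point on the line through P and Q at parameter t
    return P + t * d

print(line(0))    # P itself
print(line(1))    # Q itself
print(line(0.5))  # the midpoint of P and Q
```

Geometrically, d is the arrow from P to Q; adding scalar multiples of that arrow to P traces out the whole line.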
  3. Yes: finite-dimensional vector spaces are isomorphic exactly when they have the same dimension, which is exactly when the transformation matrix is square.
  4. Well, vectors only become a row or a column when you write them down on paper. In fact, vectors are elements of a vector space and do not come with an instruction to write them down in a predefined way. However, when doing linear algebra it is common to write elements x of R^n or C^n in the following way:

    \left[ \begin{array}{c} x_1 \\ \vdots \\ x_n \end{array} \right]​

    You can write them as (x_1, \dots, x_n) or as rows as well, but the common convention is columns, as above.
  5. Yes, by row reduction you can easily find the rank of an m \times n matrix A. To find the null space you have to do a little more work. There is a useful theorem (the rank–nullity theorem, often counted as part of the fundamental theorem of linear algebra) which states

    \mbox{rank}{A} + \dim\mbox{Nul}{A} = n​
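As an illustration of the theorem, here is a sketch with a made-up 3×4 example matrix whose third row is the sum of the first two. The rank comes from counting nonzero singular values (which is what row reduction counts as pivots), and the nullity is whatever is left over out of the n = 4 columns:

```python
import numpy as np

A = np.array([[1, 2, 0, 3],
              [0, 1, 1, 1],
              [1, 3, 1, 4]], dtype=float)  # row 3 = row 1 + row 2

n = A.shape[1]
rank = np.linalg.matrix_rank(A)

# Nullity via the SVD: each numerically-zero singular value
# corresponds to one independent null-space direction.
s = np.linalg.svd(A, compute_uv=False)
nullity = n - np.count_nonzero(s > 1e-10)

print(rank, nullity, n)  # rank + nullity equals n, as the theorem says
```

Note the n in the theorem is the number of columns (the dimension of the domain), not the number of rows.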
 
