General Questions on Polynomials, Vectors, and Matrices

  • Thread starter: mrxtothaz
  • Tags: General
In summary, the conversation includes discussions on proving that a set of polynomials spans a 3-dimensional space, finding the equation of a line passing through two points using vectors, the concept of invertible transformations and its relation to square matrices, and techniques for finding the rank and nullity of a transformation using row reduction. The conversation also touches on the representation of vectors as row or column vectors, and the fundamental theorem of linear algebra.
  • #1
mrxtothaz
I'm doing a bit of review and have a few brief questions.

1) Say you have 3 polynomials that generate a 3-dimensional space. Let this basis be {x^2 + 3x - 2, 2x^2 + 5x - 3, -x^2 - 4x + 4}. To prove that these vectors span the space, I want to show that any vector in this space can be expressed using these basis elements. I did this at the beginning of the year before learning matrices; I have attempted to do this with the use of matrices and for some reason I can't figure it out.

My approach has basically been to put the coefficients of each degree-2 term in a row and solve for a, and then to do the same thing for the terms of other degrees (in terms of b and c). What am I doing wrong?

2) To find the equation of a line passing through two points, say (3, -2, 4) and (-5, 7, 1), why is it that this is done by subtracting them, to get a direction scaled by some parameter t? The equation ends up being v = (3, -2, 4) + t*d, where I presume the direction is d = (-5, 7, 1) - (3, -2, 4) = (-8, 9, -3). But I don't understand the geometric reason behind this.

3) Is there such a thing as an invertible m×n transformation? Or does this concept apply only to square matrices? In my reading, I have come across the definition that an isomorphism between vector spaces (say S and T) is when you have an invertible map between them such that both ST = I and TS = I. Is this only the case for square matrices?

4) Before being introduced to linear transformations being represented by matrices, I used to think of vectors as being row vectors. Then I learned it was the other way around. Something I have recently read has brought a new question to mind. Basically, what I read was that the correspondence of a vector x with [x]_B (the coordinate vector of x relative to B) provides a linear transformation from V to F^n. So am I correct to interpret this as saying that you can consider the elements of some vector space as row vectors, but with respect to some basis (putting them w.r.t. a basis is a transformation), they end up becoming column vectors?

5) By using row reduction, you can find out the rank and nullity of a given transformation. I have read that you can tell this both by looking at the columns or the rows (the maximum number of linearly independent rows OR columns). Do both these techniques work with row reduction?

I would appreciate any help. Thanks in advance.
 
  • #2
  1. Write the polynomials in your basis as vectors, so you have {u, v, w} (for example u = (1, 3, -2)). Now put them in a matrix [u v w]. Show this matrix is invertible (i.e. reducible to the identity matrix).
  2. Instead of a 3-dimensional example you might want to look at a 2-dimensional example. Drawing them could help too.
  3. Yes, two finite-dimensional vector spaces can only be isomorphic when they have the same dimension, so the matrix of an invertible transformation is square.
  4. Well, vectors only become a row or a column when you write them down on paper. In fact, vectors are elements of a vector space and do not come with an instruction to write them down in a predefined way. However, when doing linear algebra it is common to write elements x of [tex]\mathbb{R}^n[/tex] or [tex]\mathbb{C}^n[/tex] in the following way:

    [tex]\left[ \begin{array}{c} x_{1} \\ \vdots \\ x_{n} \end{array} \right][/tex]​

    You can write them as [tex](x_{1}, \dots, x_{n})[/tex] as well, or as rows, but the common way is as columns, like I just did.
  5. Yes, by row reduction you can easily find the rank of an [tex]m \times n[/tex] matrix [tex]A[/tex]. To find the nullity you have to do a little bit more work. There is a useful theorem (the rank-nullity theorem, often folded into the fundamental theorem of linear algebra) which states

    [tex]\operatorname{rank} A + \dim \operatorname{Nul} A = n[/tex]​
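The invertibility check in point 1 can be sketched numerically; this is an illustrative snippet (NumPy, not from the thread), encoding each polynomial by its coefficient vector:

```python
import numpy as np

# Encode each polynomial by (x^2 coefficient, x coefficient, constant term).
u = np.array([1, 3, -2])    # x^2 + 3x - 2
v = np.array([2, 5, -3])    # 2x^2 + 5x - 3
w = np.array([-1, -4, 4])   # -x^2 - 4x + 4

# Put the coefficient vectors in the columns of a matrix.
A = np.column_stack([u, v, w])

# A nonzero determinant means A is invertible (row-reducible to the
# identity), so {u, v, w}, and hence the three polynomials, span the
# 3-dimensional space of polynomials of degree at most 2.
det = np.linalg.det(A)
assert abs(det) > 1e-12   # the determinant works out to -1, so the set spans
```

The same conclusion follows by row-reducing A by hand to the identity, which is the pencil-and-paper version of this check.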
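For question 2, here is a small sketch (hypothetical NumPy code) of the parametric line v(t) = P + t(Q - P) through the two given points:

```python
import numpy as np

# The line through P and Q: start at P and move along the direction Q - P.
P = np.array([3, -2, 4])
Q = np.array([-5, 7, 1])
d = Q - P   # direction vector: (-8, 9, -3)

def point_on_line(t):
    """Return the point reached by starting at P and moving t steps of d."""
    return P + t * d

# t = 0 gives P, t = 1 gives Q, and other values of t trace out the rest
# of the line. Geometrically, subtracting the points gives the arrow from
# P to Q, and t scales that arrow; t itself is a number, not a vector.
```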
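For point 5, the rank-nullity theorem can be checked on a small example (illustrative NumPy code; the matrix is chosen for demonstration and is not from the thread):

```python
import numpy as np

# A 3x3 matrix with one dependent row (the second row is twice the first).
A = np.array([[1, 2, 3],
              [2, 4, 6],
              [1, 0, 1]])

# The rank is the number of linearly independent rows, which equals the
# number of linearly independent columns.
rank = np.linalg.matrix_rank(A)   # 2

# Rank-nullity: rank(A) + dim Nul(A) = n, where n is the number of columns.
nullity = A.shape[1] - rank       # 3 - 2 = 1
```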
 

Related to General Questions on Polynomials, Vectors, and Matrices

1. What are polynomials?

Polynomials are mathematical expressions consisting of variables and coefficients, combined using addition, subtraction, multiplication, and non-negative integer exponents.

2. How do you add or subtract polynomials?

To add or subtract polynomials, you simply combine like terms. This means that you add or subtract the coefficients of the same variables. For example, to add 3x^2 + 5x + 2 and 2x^2 + 4x + 1, you would combine the coefficients of x^2, x, and the constants separately: (3 + 2)x^2 + (5 + 4)x + (2 + 1) = 5x^2 + 9x + 3.
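The worked example above can be checked with a short snippet (illustrative Python, representing a polynomial by its coefficient tuple):

```python
# Represent a*x^2 + b*x + c by the coefficient tuple (a, b, c); adding
# polynomials is then componentwise addition of the tuples.
def add_poly(p, q):
    return tuple(a + b for a, b in zip(p, q))

p = (3, 5, 2)   # 3x^2 + 5x + 2
q = (2, 4, 1)   # 2x^2 + 4x + 1
total = add_poly(p, q)   # (5, 9, 3), i.e. 5x^2 + 9x + 3
```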

3. What are vectors?

Vectors are mathematical objects that have both magnitude and direction. They can be represented graphically as arrows, with the length of the arrow representing the magnitude and the direction of the arrow indicating the direction.

4. How do you add or subtract vectors?

To add or subtract vectors, you simply add or subtract the corresponding components of the vectors. For example, if you have two vectors v = <2, 4> and w = <5, 2>, their sum v + w would be <7, 6>.

5. What are matrices?

Matrices are rectangular arrays of numbers or variables. They can be used to represent and solve systems of equations, perform transformations, and store data in computer programs.
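As a small illustration of using a matrix to solve a system of equations (a hypothetical example, sketched with NumPy): the system x + 2y = 5, 3x + 4y = 11 becomes A @ v = b.

```python
import numpy as np

# Coefficient matrix and right-hand side of the system
#   x + 2y = 5
#  3x + 4y = 11
A = np.array([[1, 2],
              [3, 4]])
b = np.array([5, 11])

# Solving the matrix equation recovers the unknowns.
solution = np.linalg.solve(A, b)   # x = 1, y = 2
```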
