Linearly independent vector space

In summary, the set of polynomials with a leading coefficient of 1 does not constitute a vector space because it fails both properties required of a subspace: closure under vector addition and closure under scalar multiplication. Moreover, in the case of polynomials of degree < 1 (the constant functions), the further requirement of a leading coefficient of 1 reduces the set to a single "vector".
  • #1
realcomfy
I have a quick question about vector spaces.

Consider the vector space of all polynomials of degree < 1. If the leading coefficient (the number that multiplies [tex]x^{N-1}[/tex]) is 1, does the set still constitute a vector space?

I am thinking that it doesn't because the coefficient multiplying [tex]x^{N-1}[/tex] is the same as the coefficient multiplying [tex]x^{0} = 1[/tex], and then it would not be linearly independent, or something like that, but I am not totally sure about this. Any clarification would be greatly appreciated.
 
  • #2
realcomfy said:
I have a quick question about vector spaces.

Consider the vector space of all polynomials of degree < 1. If the leading coefficient (the number that multiplies [tex]x^{N-1}[/tex]) is 1, does the set still constitute a vector space?

I am thinking that it doesn't because the coefficient multiplying [tex]x^{N-1}[/tex] is the same as the coefficient multiplying [tex]x^{0} = 1[/tex], and then it would not be linearly independent, or something like that, but I am not totally sure about this. Any clarification would be greatly appreciated.
No, it is not a matter of being "linearly independent": that is a property of a basis for a subspace, not of the subspace itself. Polynomials in a subspace of polynomials can have the same coefficient for different powers; there is nothing wrong with that.

A subspace must have two properties:
a) It is closed under vector addition.
b) It is closed under scalar multiplication.

It should be easy to see that neither of those is satisfied by a set of polynomials with leading coefficient 1. If you add two such polynomials, you get a polynomial with leading coefficient 2, not 1. If you multiply such a polynomial by the number "a", you get a polynomial with leading coefficient "a", not 1.
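As a quick numerical illustration of this closure failure, here is a minimal Python sketch; the specific polynomials p and q are just an example, represented by their coefficient arrays with the highest power first:

[code]
import numpy as np

# Both p and q have leading coefficient 1 (degree 2, for concreteness).
p = np.array([1.0, 2.0, 3.0])   # x^2 + 2x + 3
q = np.array([1.0, -1.0, 5.0])  # x^2 - x + 5

s = p + q      # "vector addition" is coefficientwise addition
t = 4.0 * p    # scalar multiplication scales every coefficient

print(s[0])    # 2.0 -> leading coefficient 2, so p + q has left the set
print(t[0])    # 4.0 -> leading coefficient 4, so 4*p has left the set
[/code]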

By the way, did you really mean "the vector space of all polynomials of degree < 1"? That is the set of all constant functions and further requiring that "the leading coefficient is 1" reduces it to a single "vector"!
 

What is a linearly independent vector space?

A linearly independent vector space is a mathematical concept describing a collection of vectors that do not depend on each other: no vector in the space can be written as a linear combination of the other vectors. In simpler terms, the vectors in a linearly independent set are not redundant, and each one is needed to describe the space.
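For example, in [tex]\mathbb{R}^{2}[/tex] the vectors (1, 0) and (0, 1) are linearly independent, while the set {(1, 0), (0, 1), (1, 1)} is not, since (1, 1) = (1, 0) + (0, 1).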

What are the requirements for a set of vectors to be linearly independent?

A set of vectors is considered linearly independent if none of the vectors can be written as a combination of the other vectors using scalar multiplication and addition. Equivalently, the only linear combination of the vectors that equals the zero vector is the trivial one, in which every scalar coefficient is zero.

How is linear independence related to linear dependence?

Linear independence and linear dependence are two sides of the same coin. While a set of vectors is linearly independent if none of the vectors can be written as a linear combination of the others, a set of vectors is linearly dependent if at least one vector can be written as a linear combination of the others. In other words, if a set of vectors is not linearly independent, it is linearly dependent.

Why is linear independence important in mathematics and science?

Linear independence is a fundamental concept in mathematics and science because it allows us to describe and analyze vector spaces and systems of equations. The property of linear independence plays a crucial role in determining the solutions to linear systems, and it is also used in many other areas of mathematics, such as linear algebra and differential equations.

How can I determine if a set of vectors is linearly independent?

To determine if a set of vectors is linearly independent, you can use a few different methods. If the number of vectors equals the dimension of the space, you can form the square matrix whose columns are the vectors and compute its determinant: the vectors are linearly independent exactly when the determinant is non-zero. More generally, you can work from the definition and look for a non-trivial solution to the equation a1v1 + a2v2 + ... + anvn = 0, where a1, a2, ..., an are scalars and v1, v2, ..., vn are the vectors in the set. If a non-trivial solution exists, the vectors are linearly dependent; if the only solution is the trivial one, they are linearly independent.
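To make both checks concrete, here is a minimal NumPy sketch; the three vectors are an arbitrary example in R^3, not taken from the thread:

[code]
import numpy as np

# Three vectors in R^3, stacked as the columns of a matrix.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 3.0])   # v3 = v1 + v2, so the set is dependent
A = np.column_stack([v1, v2, v3])

# Determinant test: valid here because A is square (3 vectors in R^3).
print(np.linalg.det(A))   # ~0.0 -> linearly dependent

# Rank test: works for any number of vectors. The columns are
# independent exactly when the rank equals the number of vectors.
print(np.linalg.matrix_rank(A) == A.shape[1])   # False
[/code]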
