Linear Independence: Writing vectors as linear combinations

In summary, you set up a linear system and found that the set of vectors is linearly dependent, so the homogeneous system AX = 0 has non-trivial solutions and one of the polynomials can be written as a linear combination of the others.
  • #1
skylit
Forgive me for not writing in LaTeX, but I searched this site for 10 minutes looking for a LaTeX reference and could not find anything on matrices. Also, excuse the excessive amount of info.

Homework Statement



Determine whether this list of 3 polynomials in P1:
p1 = 1+3x
p2 = 1+2x
p3 = 2+3x
is linearly independent. If not, write one of the pi in terms of the others.
To test independence, let's see if the linear system
x1 p1 + x2 p2 + x3 p3 = 0 has any non-trivial solutions.
First, write the coefficient matrix A for a linear system representing the polynomial equation.
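For reference, here is a minimal sketch of how such a coefficient matrix can be written down (it agrees with the matrix used in the attempt below). Substituting the three polynomials into the test equation gives
[tex]
x_1(1+3x) + x_2(1+2x) + x_3(2+3x) = 0 .
[/tex]
Collecting the constant terms and the coefficients of x yields the two equations
[tex]
x_1 + x_2 + 2x_3 = 0 , \qquad 3x_1 + 2x_2 + 3x_3 = 0 ,
[/tex]
whose coefficient matrix is
[tex]
A = \left[ \matrix{1 & 1 & 2 \\ 3 & 2 & 3} \right] .
[/tex]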

Homework Equations



The Attempt at a Solution



I reduced the matrix
[1 1 2]
[3 2 3]

to

[1 0 -1]
[0 1 3]
I set Ax=0.
I found that the set {pi} is linearly dependent, and that a non-trivial solution to AX = 0 is (0,0,0) (Is this always the case? Pretty much all of the problems that I've come across have this as a solution for AX=0)

Now, the next part is my issue. The directions then read:
"In particular, your solution to AX = 0 implies
= 0
Now use this linear dependence relation among the vectors { p1, p2, p3 } to write one of these vectors as a linear combination of the others."


This is where I am totally lost. I'm solving problems via a website, and every attempt at this last part has been marked wrong. I've guessed every combination along the lines of p1 = p2 + p3, and I am not sure how to approach this last question.

There are numerous problems that ask the same question as the one above. It would be greatly appreciated if someone could give an explanation and the steps for solving a problem like this, as I will have to apply them to other similar problems.
 
  • #2
I am not too sure exactly what you're doing here, but your first step was fine. You wrote
[tex]
a_{1}p_{1}+a_{2}p_{2}+a_{3}p_{3}=0
[/tex]
and came up with a homogeneous system with more unknowns than equations, which must have non-trivial solutions, so the set is linearly dependent.

Now what you do is write
[tex]
p_{1}=b_{1}p_{2}+b_{2}p_{3}
[/tex]
and equate coefficients of the powers of x; this should give the answer you want.
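To make this hint concrete, here is one way to carry it out for the polynomials in this thread (a short sketch using the b1, b2 above; the arithmetic is easy to check by hand). Writing out the right-hand side,
[tex]
1 + 3x = b_{1}(1+2x) + b_{2}(2+3x) = (b_{1} + 2b_{2}) + (2b_{1} + 3b_{2})x ,
[/tex]
and equating constant terms and coefficients of x gives b1 + 2b2 = 1 and 2b1 + 3b2 = 3, whose solution is b1 = 3, b2 = -1. That is,
[tex]
p_{1} = 3p_{2} - p_{3}, \qquad \text{check: } 3(1+2x) - (2+3x) = 1 + 3x .
[/tex]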
 
  • #3
skylit said:
Forgive me for not writing in LaTeX, but I searched this site for 10 minutes looking for a LaTeX reference and could not find anything on matrices.

Here is a matrix in LaTeX:

[tex] A =\left[ \matrix{1 & 1 & 2 \\ 3 & 2 & 3} \right][/tex]

The commands are "A = \left[ \matrix{ 1 & 1 & 2 \\ 3 & 2 & 3 } \right]". In a row the "&" is a column separator, and the "\\" is an end-of-row command. You get closed brackets by the "\left[ ... \right]" pair; you must *always* have a pair, not just one of them by itself. If you want rounded brackets, use "\left( ... \right)" instead; you can even have one square and one rounded:
[tex] B = \left( \matrix{1&2&3\\4&5&6}\right) , \; C = \left( \matrix{a+b & c+d \\ e & f} \right]. [/tex]

RGV
 
  • #4
One very quick point: the space you are talking about is two-dimensional; a basis is {1, x}. It is impossible to have three linearly independent vectors in a two-dimensional space.
 
  • #5
I have found it useful to remember that the dimension of a space is:

1. The number of vectors in any basis for the space
2. The number of vectors in the largest possible set of linearly independent vectors in the space
3. The number of vectors in the smallest possible set of vectors that spans the space

Thinking about how each one of these interacts with the others helped me a lot in understanding bases and dimension.
 
  • #7
"To test independence, let's see if the linear system
x1 p1 + x2 p2 + x3 p3 = 0
has any non-trivial solutions."

I would say ##(x_1,x_2,x_3) = (0,0,0)## is a trivial solution, not a non-trivial solution.

By contrast, by inspection, ##(x_1,x_2,x_3) = (1,-3,1)## is a non-trivial solution.
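For completeness, that non-trivial solution can be verified by substituting it back into the polynomial equation; the constant terms and the coefficients of x each cancel:
[tex]
1\cdot(1+3x) - 3\cdot(1+2x) + 1\cdot(2+3x) = (1 - 3 + 2) + (3 - 6 + 3)x = 0 .
[/tex]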
 

1. What is linear independence?

A set of vectors in a vector space is linearly independent if none of the vectors in the set can be written as a linear combination of the others. Equivalently, the only linear combination of the vectors that equals the zero vector is the one in which every coefficient is zero.

2. How do you write a vector as a linear combination?

To write a vector as a linear combination, you need to express it as a sum of scalar multiples of other vectors. For example, if we have a vector v = (2, 3), we can write it as v = 2(1, 0) + 3(0, 1). This means that v is a linear combination of the vectors (1, 0) and (0, 1).

3. Why is linear independence important?

Linear independence is important because it tells us when a set of vectors contains no redundancy. A set that is both linearly independent and spans a vector space is a basis for that space, and the number of vectors in a basis gives the dimension. If a set is linearly dependent, at least one of its vectors can be removed without changing the span.

4. How do you check for linear independence?

To check for linear independence, we can use the determinant method or the vector equation method. The determinant method applies when we have n vectors in an n-dimensional space: put the vectors as the columns of a square matrix and compute its determinant; if the determinant is non-zero, the vectors are linearly independent. The vector equation method works in general: set a linear combination of the vectors with unknown coefficients equal to the zero vector and solve the resulting system; if the only solution is the trivial one (all coefficients zero), the vectors are linearly independent.
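As a concrete illustration of these checks, here is a minimal Python sketch (using NumPy, which is an assumption on my part; any linear-algebra tool would do) that tests the three polynomials from this thread by comparing the rank of their coefficient matrix to the number of vectors. Full column rank is equivalent to the vector equation having only the trivial solution, and for a square matrix it agrees with the determinant criterion.
[code]
import numpy as np

# Coefficient vectors of p1 = 1 + 3x, p2 = 1 + 2x, p3 = 2 + 3x
# with respect to the basis {1, x} of P1, placed as the columns of A.
A = np.array([[1, 1, 2],
              [3, 2, 3]])

# The vectors are linearly independent exactly when rank(A) equals
# the number of columns; here rank(A) = 2 < 3, so they are dependent.
print(np.linalg.matrix_rank(A) == A.shape[1])  # prints False
[/code]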

5. What is the difference between linear independence and linear dependence?

The main difference is that in a linearly dependent set at least one vector can be written as a linear combination of the others, while in a linearly independent set none can. In other words, linearly dependent vectors satisfy a non-trivial linear relation with each other, while linearly independent vectors satisfy only the trivial one.
