
Completing to a basis

  1. May 22, 2015 #1
    1. The problem statement, all variables and given/known data
    Consider in the space ##\mathbb{R}^5## vectors ##\vec{v}_1 = (2,1, 1, 5, 3)^T## , ##\vec{v}_2 = (3, 2, 0, 0, 0)^T## , ##\vec{v}_3 = (1, 1, 50, 921, 0)^T##.
    a) Prove that these vectors are linearly independent.
    b) Complete this system of vectors to a basis.
    If you do part b) first you can do everything without any computation.

    2. Relevant equations


    3. The attempt at a solution
    If I were to do a) first, I would put the 3 vectors into a matrix as columns, bring it to echelon form by row reduction, and note that there is a pivot in every column (a quick numerical version of this check is sketched below). Even better, I could do the row reduction with two additional arbitrary vectors and choose their components so that the final echelon form has a pivot in every row and column. However, this method is cumbersome and requires tedious calculations. The question clearly suggests doing b) first to avoid all calculations (that's probably the reason for the hint and the ugly numbers in ##\vec{v}_3##).
    However, I do not see a way to choose two more vectors outside ##\operatorname{span}(\vec{v}_1,\vec{v}_2,\vec{v}_3)## to complete the system to a basis without guessing or without the tedious row reduction mentioned above (I could do it, but I would prefer a more elegant approach).
    Any suggestions on the best method to solve this one?

    Any suggestions or comments will be greatly appreciated!
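    The brute-force check described above can also be sanity-checked numerically. Here is a minimal sketch, assuming NumPy is available; the script, the use of a rank computation, and the candidate completion tested at the end are illustrative and not part of the original thread.

    Code (Python):
        import numpy as np

        # The three given vectors in R^5.
        v1 = np.array([2, 1, 1, 5, 3])
        v2 = np.array([3, 2, 0, 0, 0])
        v3 = np.array([1, 1, 50, 921, 0])

        # Part a): stack the vectors as columns; rank 3 means they are linearly independent.
        A = np.column_stack([v1, v2, v3])
        print(np.linalg.matrix_rank(A))  # prints 3

        # Part b): any candidate completion can be tested the same way;
        # rank 5 means the five vectors form a basis of R^5.
        B = np.column_stack([v1, v2, v3,
                             np.array([0, 0, 0, 1, 0]),   # e_4 (one possible choice)
                             np.array([0, 0, 0, 0, 1])])  # e_5
        print(np.linalg.matrix_rank(B))  # prints 5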
     
  3. May 22, 2015 #2

    Zondrina

    Homework Helper

    There isn't much to it. Row reducing a matrix built from the vectors to prove they are linearly independent will show that ##x_4## and ##x_5## are free.

    Writing out the solution set will show it is spanned by two linearly independent vectors.

    I think it would be difficult to see this basis directly, unless you assume the conclusion of a) is true at the outset of the problem. Then you would know what form the linearly independent spanning vectors must take.
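    One way to read this suggestion (a sketch of the setup; the explicit matrix below is an interpretation, not something stated in the post above): put the three given vectors as the rows of a ##3 \times 5## matrix and consider the homogeneous system
    $$A = \begin{pmatrix} 2 & 1 & 1 & 5 & 3 \\ 3 & 2 & 0 & 0 & 0 \\ 1 & 1 & 50 & 921 & 0 \end{pmatrix}, \qquad A\vec x = \vec 0.$$
    If the three rows are linearly independent, the echelon form has pivots in three of the five columns, so two variables (here ##x_4## and ##x_5##) are free and the solution set, i.e. the null space of ##A##, is two-dimensional.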
     
  4. May 22, 2015 #3
    Alright then, I guess I will have to do some dirty work :p.
    Thank you for the reply!
     
  5. May 22, 2015 #4

    Zondrina

    Homework Helper

    I want to clarify what I said earlier; I feel I was a little ambiguous.

    If you know the vectors are linearly independent, then you know what the final form of the matrix will look like when you reduce ##A \vec x = \vec 0## before you even reduce it.

    If you have ##3## vectors in ##\mathbb{R}^5##, you know immediately there will be ##2## free variables because there will be ##2## full rows of ##0's## when the matrix is reduced.

    Since the vectors are linearly independent, only the trivial solution exists for the independent variables, i.e. you can comfortably place ##0's## in many of the vector indices for the solution set without much thought.

    All that would be left to do is to place a ##1## in the index of each free variable for their respective vector in the solution set.

    The span of these vectors will form the basis without the need to row reduce.
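    Continuing the reading sketched above (still a sketch; the starred entries are placeholders, not computed values): with pivots in the first three columns, as turns out to be the case here, the general solution of ##A\vec x = \vec 0## takes the form
    $$\vec x = x_4 \begin{pmatrix} \ast \\ \ast \\ \ast \\ 1 \\ 0 \end{pmatrix} + x_5 \begin{pmatrix} \ast \\ \ast \\ \ast \\ 0 \\ 1 \end{pmatrix},$$
    with a ##1## in each free-variable slot, as described in the post above. The two vectors on the right span the null space of ##A##, which is the orthogonal complement of ##\operatorname{span}(\vec{v}_1, \vec{v}_2, \vec{v}_3)##, so appending them to the three given vectors does produce a basis of ##\mathbb{R}^5##. Pinning down the starred entries, however, still requires the row reduction the hint is trying to avoid, which is why the inspection argument suggested later in the thread is the shorter route.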
     
  6. May 22, 2015 #5
    I think I get it now. Basically, assuming linear independence of the first 3 vectors, the two remaining vectors can be taken from the standard basis of ##\mathbb{R}^5##, and all that's left is to find which ones?
     
  7. May 22, 2015 #6

    Mark44

    Staff: Mentor

    Zondrina said:
    "If you have ##3## vectors in ##\mathbb{R}^5##, you know immediately there will be ##2## free variables because there will be ##2## full rows of ##0's## when the matrix is reduced."

    You will have at least 2 free variables, since the three vectors might be linearly dependent (coplanar or even collinear). In the previous paragraph you made the assumption that the three vectors were linearly independent, in which case the sentence above is correct, but I wasn't sure if that assumption still held in the next paragraph.

    For clarity, you might have written, "If you have ##3## linearly independent vectors in ##\mathbb{R}^5##, you know immediately there will be ##2## free variables..."
     
  8. May 22, 2015 #7

    vela

    Staff Emeritus
    Science Advisor
    Homework Helper
    Education Advisor

    A better approach would be to start with the definition of linear independence and think more generally about how to solve the system of equations rather than resorting to matrices. To show linear independence, you want to solve
    $$c_1\begin{pmatrix} 2\\1\\1\\5\\3\end{pmatrix} + c_2 \begin{pmatrix} 3\\2\\0\\0\\0 \end{pmatrix} + c_3 \begin{pmatrix} 1\\1\\50\\921\\0\end{pmatrix} = \vec 0.$$ You should be able to see by inspection that ##c_1=0##. And it's pretty easy to show that ##c_2 = c_3 = 0## follows with virtually no calculation.
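    Written out componentwise, this is just the system
    $$\begin{aligned} 2c_1 + 3c_2 + c_3 &= 0 \\ c_1 + 2c_2 + c_3 &= 0 \\ c_1 + 50c_3 &= 0 \\ 5c_1 + 921c_3 &= 0 \\ 3c_1 &= 0, \end{aligned}$$
    and one way to see the point is to read the equations from the bottom up (this restatement is an editorial gloss on the post above, not part of the original reply).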
     
  9. May 22, 2015 #8
    Ah, that's the simplicity I was looking for! (Is 'by inspection' a valid formal argument?) ##c_1=0## because of the ##3## at the bottom of ##\vec{v}_1##, ##c_3=0## because of the ##921##, and ##c_2=0## because it's the last one left standing, right? As for a possible completion to a basis, by inspection I think these two will work: ##\vec{v}_4 = (0,0,1,0,0)^T## and ##\vec{v}_5 = (0,1,0,0,0)^T## (or ##\vec{v}_5 = (1,0,0,0,0)^T##).
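    For the record, the proposed completion checks out by the same kind of inspection (a verification sketch spelling out why these two work, using ##\vec{v}_4 = (0,0,1,0,0)^T## and ##\vec{v}_5 = (0,1,0,0,0)^T## from the post above): if
    $$c_1\vec{v}_1 + c_2\vec{v}_2 + c_3\vec{v}_3 + c_4\vec{v}_4 + c_5\vec{v}_5 = \vec 0,$$
    then the fifth component gives ##3c_1 = 0##, so ##c_1 = 0##; the fourth gives ##921c_3 = 0##, so ##c_3 = 0##; the first gives ##3c_2 = 0##, so ##c_2 = 0##; the second component then forces ##c_5 = 0## and the third forces ##c_4 = 0##. Five linearly independent vectors in ##\mathbb{R}^5## form a basis, so this completion works (and the variant with ##\vec{v}_5 = (1,0,0,0,0)^T## can be checked the same way).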
     