
Understanding Bases

  1. Jun 26, 2005 #1
    I have two homework problems that I am at a loss on where to start. I am going to see the TA tomorrow, but I would like to start on the problems tonight.


    The question is (the row vectors I show are actually written as column vectors on the homework):

    Consider the real vector space R^3. One basis we can use is called the Standard Basis. This is the basis
    B={|e1>=(1 0 0), |e2>=(0 1 0), |e3>=(0 0 1)}
    Show that the set of vectors
    B'={|a1>=(1 0 -1), |a2>=(1 2 1), |a3>=(0 -3 2)}
    is also a basis for R^3. Express each of the vectors of the standard basis in terms of these new basis vectors. (Hint: To check if you are on the right track, you should get that
    |e1>=-7/10|a1> + 3/10|a2> + 2/5|a3> = (-7/10 3/10 2/5)
    but you need to show this and find the other two.)

    I need a little help getting started. I tried: B = {i+j, i-j, k} where i+j=|v> and i-j=|w>, so that i=1/2(v+w) and j=1/2(v-w). I have also tried combining a1, a2 & a3 into a 3x3 matrix and multiplying it by each e, but that didn't work either.

    Thanks!
     
  2. Jun 26, 2005 #2
    In this case it suffices to show that the vectors in B' are linearly independent. So you can construct the determinant that has the vectors' coordinates as its columns (or rows). What property must this determinant have if a1, a2 and a3 are linearly independent?
    Do you know what the matrix of a linear transformation is? If so, then the inverse of that matrix will help you find the coordinates of the standard basis vectors in the new basis.
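    For anyone who wants a quick numerical sanity check of this approach, here is a minimal sketch using numpy (just a convenience for checking hand calculations, not part of the assignment; it puts a1, a2, a3 as the columns of the matrix, which is one of the two set-ups mentioned above):
[code]
import numpy as np

# Columns are the candidate basis vectors a1, a2, a3
A = np.array([[ 1,  1,  0],
              [ 0,  2, -3],
              [-1,  1,  2]], dtype=float)

# A nonzero determinant means the columns are linearly independent,
# so B' is a basis of R^3
print(np.linalg.det(A))   # about 10

# c = A^-1 v solves A c = v, so the columns of A^-1 are the
# coordinates of e1, e2, e3 in the new basis
print(np.linalg.inv(A))
[/code]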
    P.S Hooray! It's my 100th post here! It's great place!
    P.P.S. Welcome, Blanik!
     
  3. Jun 26, 2005 #3
    Thanks! That helped a lot. I am much closer, but I am still not coming up with the same numbers as his |e1>.

    Here is what I did:
    If a1, a2 & a3 are the columns of a matrix A, then
    [tex]A = \left( \begin{array}{ccc}
    1 & 1 & 0 \\
    0 & 2 & -3 \\
    -1 & 1 & 2 \end{array} \right)[/tex]

    The determinant of A = 10
    The inverse of A is:
    [tex]A^{-1} = \frac{1}{10}\left( \begin{array}{ccc}
    7 & -2 & -3 \\
    3 & 2 & 3 \\
    2 & 0 & 2 \end{array} \right)[/tex]

    So, I get |e1> = (7/10 3/10 1/5), but the problem says |e1> should be (-7/10 3/10 2/5).
    So mine differs in that my first value has the opposite sign, and my last value is off by a factor of two.
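    For what it's worth, one quick way to catch an arithmetic slip in a hand-computed inverse is to multiply it back against A and check for the identity; a small numpy sketch using the matrices written out above:
[code]
import numpy as np

A = np.array([[ 1,  1,  0],
              [ 0,  2, -3],
              [-1,  1,  2]], dtype=float)

# The inverse as computed above (with the 0 in the last row)
A_inv_guess = np.array([[7, -2, -3],
                        [3,  2,  3],
                        [2,  0,  2]], dtype=float) / 10

# Should print the 3x3 identity; entries that deviate from it
# point at where the slip is
print(A @ A_inv_guess)
[/code]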

    Any suggestions?
     
  4. Jun 26, 2005 #4
    There are 2 points:
    1. Your inverse is incorrect. I believe the correct one is
    [tex]\frac{1}{10}\left( \begin{array}{ccc}
    7 & -2 & -3 \\
    3 & 2 & 3 \\
    2 & -2 & 2 \end{array} \right)[/tex]

    2.
    You shouldn't get that even if you are on the right track. In the standard basis, -7/10|a1> + 3/10|a2> + 2/5|a3> doesn't give |e1>. I believe you can see that. That hint is incorrect.
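    In case it helps, the claim is easy to verify numerically; a minimal check with numpy (the second combination uses the first column of the corrected inverse above):
[code]
import numpy as np

a1 = np.array([ 1.0,  0.0, -1.0])
a2 = np.array([ 1.0,  2.0,  1.0])
a3 = np.array([ 0.0, -3.0,  2.0])

# The hint's coefficients: roughly (-0.4, -0.6, 1.8), not e1
print(-7/10 * a1 + 3/10 * a2 + 2/5 * a3)

# First column of the corrected inverse: e1, up to floating-point rounding
print( 7/10 * a1 + 3/10 * a2 + 2/10 * a3)
[/code]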
     
  5. Jun 26, 2005 #5
    I thought that the hint might be wrong... thanks for clarifying. I caught my math error too for the zero value in the inverse matrix.

    Thanks so much for your help!
     