
Linear Algebra augmented matrix

  1. Apr 29, 2006 #1
    1. The augmented matrix for a system of linear equations in the variables x, y and z is given below:

    [tex]\left(\begin{array}{ccc|c} 1 & -1 & 1 & 2 \\ 0 & 2 & a-1 & 4 \\ -1 & 3 & 1 & b \end{array}\right)[/tex]

    For which values of a and b does the system have:
    a) no solutions;
    b) exactly one solution;
    c) infinitely many solutions?

    For the values of a and b in c), find all solutions of the system.

    2. a) Show that the set of all vectors (x, y, z) such that x + y + z = 0 is a subspace of R^3 (Euclidean space).

    b) Let u = (2λ, -1, -1), v = (-1, 2λ, -1) and w = (-1, -1, 2λ).

    i) For what real values of λ do the vectors u, v and w form a linearly dependent set in R^3?

    ii) For each of these values, express one of the vectors as a linear combination of the other two.
     
  3. Apr 30, 2006 #2

    0rthodontist

    Science Advisor

    Where are you having trouble?
     
  4. Apr 30, 2006 #3
    For 1., I'm not sure where to start. I know it is an inhomogeneous system, with the left part of the augmented matrix being A, the unknowns x, y, z forming the vector x, and the right column being b.
    Then if Rank(A) = Rank(A|b) and this common rank is less than the number of unknowns, there are infinitely many solutions; if it equals the number of unknowns, there is exactly one solution. And if Rank(A) ≠ Rank(A|b), there are no solutions.

    But the problem for me is how to find the unknowns to get to that stage?
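The rank criterion described above can be tried out for concrete values before tackling the symbolic case. A minimal sketch in Python with sympy (the `classify` helper is my own naming, not from the thread), comparing Rank(A) and Rank(A|b):

```python
from sympy import Matrix

def classify(a, b):
    # Coefficient matrix A and augmented matrix [A|b] from question 1
    A = Matrix([[1, -1, 1],
                [0, 2, a - 1],
                [-1, 3, 1]])
    Ab = A.row_join(Matrix([2, 4, b]))
    rA, rAb = A.rank(), Ab.rank()
    if rA < rAb:
        return "no solutions"
    if rA == 3:  # rank equals the number of unknowns
        return "exactly one solution"
    return "infinitely many solutions"
```

Trying a few (a, b) pairs shows all three behaviors occur, which is exactly what the question is probing.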

    For 2a), I tried to go:

    Let S be the set of all vectors of the form (0, 0, 0) and let u and v be vectors in S.

    Therefore: u = (u, 0, 0) and v = (v, 0, 0), u, v ∈ R

    Now we test:
    i) u + v = (u, 0, 0) + (v, 0, 0) = (u + v, 0, 0)
    Therefore u + v ∈ S (since u + v ∈ R)

    ii) ku = k(u, 0, 0), k ∈ R
    = (ku, 0, 0)

    Therefore: ku ∈ S

    Therefore: S is a subspace of R^3

    And for 2b), I'm not sure how to find the values. If there weren't the unknown λ and it just asked to determine whether the set is linearly dependent, I could do it; but this is puzzling me.
     
  5. Apr 30, 2006 #4

    HallsofIvy

    Staff Emeritus
    Science Advisor

    How about starting by doing exactly what you would do to solve the matrix equation:
    [tex]\left(\begin{array}{cccc} 1 & -1 & 1 & 2 \\ 0 & 2 & a- 1 & 4 \\ -1 & 3 & 1 & b\end{array}\right)[/tex]

    Since you already have a 1 in the "first column, first row", and a 0 in the "first column, second row", you need to get a 0 in the "first column, third row" and you can do that by adding the first row to the last row.

    Continue row-reducing as far as you can. Of course, at some point you may need to divide by something involving a; you can do that and get a single unique solution as long as that "something" is not 0. For what value of a is that "something" 0? In that case, you wind up with a third row consisting of all 0s except possibly in the fourth column. What does that mean? What happens if the fourth-column entry is also 0, and for what values of a and b does that happen?
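The row operations suggested above can be traced symbolically. A sketch with sympy (assuming the matrix from post #1; the second operation clears column 2 of the third row):

```python
from sympy import Matrix, symbols, simplify

a, b = symbols('a b')
M = Matrix([[1, -1, 1, 2],
            [0, 2, a - 1, 4],
            [-1, 3, 1, b]])

M[2, :] = M[2, :] + M[0, :]  # R3 <- R3 + R1: clears the -1 in column 1
M[2, :] = M[2, :] - M[1, :]  # R3 <- R3 - R2: clears the 2 in column 2

# The third row is now [0, 0, 3 - a, b - 2], so:
#   a != 3            -> unique solution
#   a == 3 and b != 2 -> inconsistent row [0 0 0 | b-2], no solutions
#   a == 3 and b == 2 -> zero row, infinitely many solutions
print(M.row(2))
```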
     
    Last edited: Apr 30, 2006
  6. Apr 30, 2006 #5

    I do not really understand your working for question 2a. Does S represent the set of all vectors (x, y, z) such that x + y + z = 0? If yes, then your vectors u and v are not in S.
    Instead, why not define u = [tex] (x_{1} \ y_{1} \ z_{1}) [/tex] and v = [tex] (x_{2} \ y_{2} \ z_{2}) [/tex], where [tex] x_{1} + y_{1} + z_{1} = 0 [/tex] and [tex] x_{2} + y_{2} + z_{2} = 0 [/tex]. You can then carry on proving from here!

    For question 2b, just do what you would normally do if there were no unknowns! After you finish the row operations, set the last row to be a zero row to determine the values of λ for linear dependence. Do you know why this is done?
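The "zero row" test above is equivalent to asking whether the rank drops below 3. A quick sympy sketch (the `is_dependent` helper is illustrative, not from the thread):

```python
from sympy import Matrix, Rational

def is_dependent(lam):
    # Rows u, v, w from question 2b; they are linearly dependent
    # exactly when row reduction produces a zero row, i.e. rank < 3.
    M = Matrix([[2*lam, -1, -1],
                [-1, 2*lam, -1],
                [-1, -1, 2*lam]])
    return M.rank() < 3
```

Using exact `Rational` inputs keeps the rank computation exact rather than floating point.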
     
    Last edited: Apr 30, 2006
  7. May 1, 2006 #6
    For 2a), I wrote:

    let S be the set of all vectors of the form (x, y, z) such that x + y + z = 0; x, y, z ∈ R; and let u and v be vectors in S

    Therefore: u = (x1, y1, z1) and v = (x2, y2, z2), where x1 + y1 + z1 = 0 and x2 + y2 + z2 = 0, and x1, x2, y1, y2, z1, z2 ∈ R

    i) u + v = (x1 + x2, y1 + y2, z1 + z2)
    Therefore u + v ∈ S (since x1+x2, y1+y2, z1+z2 ∈ R)

    ii) ku = (kx1, ky1, kz1), k ∈ R
    Therefore: ku ∈ S (since kx1, ky1, kz1 ∈ R)

    Therefore: S is a subspace of R^3


    Working on 2b). For 1, I row reduced, but then I get weird forms of a in the 3rd column.

    Btw, where can I read a tutorial on how to use the mathematical text that you guys use?
     
  8. May 1, 2006 #7

    HallsofIvy

    Staff Emeritus
    Science Advisor

    It doesn't follow that the sum is in S just because the individual components are in R. You need to show that the sum vector also satisfies the definition of S: that (x1+ x2)+ (y1+ y2)+ (z1+ z2)= 0.

    Once again, just saying that the components are in R is not sufficient. You must show that kx1+ ky1+ kz1= 0.

    Yes, provided you clean up (i) and (ii).
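The two closure conditions spelled out above can be checked symbolically. A small sympy sketch, encoding membership in S by eliminating z (my own setup, not from the thread):

```python
from sympy import symbols, expand

x1, y1, z1, x2, y2, z2, k = symbols('x1 y1 z1 x2 y2 z2 k')
# Each vector lies in S, i.e. its components sum to 0; encode this
# by substituting z_i = -x_i - y_i.
constraint = {z1: -x1 - y1, z2: -x2 - y2}

# (i) closure under addition: component sum of u + v
sum_of_sum = ((x1 + x2) + (y1 + y2) + (z1 + z2)).subs(constraint)
# (ii) closure under scalar multiplication: component sum of k*u
sum_of_scale = (k*x1 + k*y1 + k*z1).subs(constraint)

print(expand(sum_of_sum), expand(sum_of_scale))
```

Both expressions reduce to 0, which is precisely the condition HallsofIvy says must be shown.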


    You have u = (2x, -1, -1), v = (-1, 2x, -1) and w = (-1, -1, 2x) and are asked for what values of x they are linearly dependent. Row reducing will work, and can be simplified by putting (-1, -1, 2x) as the first row, (-1, 2x, -1) as the second row, and (2x, -1, -1) as the third row. Doing that I get -4x^2 + 2x + 2 as the remaining number in the third row, and these will be linearly dependent if that is 0.
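The same values can be cross-checked with the determinant criterion instead of row reduction; a sympy sketch (my own variation, not how the thread does it):

```python
from sympy import Matrix, Rational, solve, symbols

x = symbols('x')
M = Matrix([[2*x, -1, -1],
            [-1, 2*x, -1],
            [-1, -1, 2*x]])

# u, v, w are linearly dependent exactly when the determinant of the
# matrix with those rows vanishes; det here is 8x^3 - 6x - 2.
values = solve(M.det(), x)
print(values)
```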


    There is a tutorial on LaTex formatting in the "Tutorials" section under "Science Education":
    https://www.physicsforums.com/showthread.php?t=8997
     
    Last edited: May 1, 2006
  9. May 1, 2006 #8
    Ok, thanks!



    Yeah, I worked that out myself, and was just about to post an update. I apologize for not posting my remark clearly; it should have been:

    "I'm working on 2b now.
    However, for 1., I row reduced, but then I get weird forms of a in the 3rd column."


    Anyway, I got λ = 1 or λ = -1/2. Furthermore, when λ = -1/2 the second row is a zero row as well. My final matrix (writing the parameter as [tex]\lambda[/tex]):

    [tex] \left(\begin{array}{ccc|c}1 & 1 & -2\lambda & 0\\0 & 2\lambda + 1 & -2\lambda -1 & 0\\0 & 0 & 4\lambda^2 -2\lambda -2 & 0\end{array}\right)[/tex]


    Thanks dude, you've been a great help!
     
    Last edited: May 1, 2006
  10. May 1, 2006 #9
    So for [tex]\lambda[/tex] = 1,

    [tex] \left(\begin{array}{ccc|c}1 & 1 & -2 & 0\\0 & 3 & -3 & 0\\0 & 0 & 0 & 0\end{array}\right)[/tex]

    [tex]c_{3}[/tex] is arbitrary; [tex]c_{3} = t[/tex], [tex]t \in[/tex] R
    [tex]3c_{2} = 3c_{3} \Rightarrow c_{2} = c_{3} = t[/tex]
    [tex]c_{1} = 2c_{3} - c_{2} = 2t - t = t[/tex]

    (because [tex]c_{1}\left(\begin{array}{ccc}2\lambda & -1 & -1\end{array}\right) + c_{2}\left(\begin{array}{ccc}-1 & 2\lambda & -1\end{array}\right) + c_{3}\left(\begin{array}{ccc}-1 & -1 & 2\lambda\end{array}\right) = \mathbf {0}[/tex])

    solution space: [tex]\left(\begin{array}{ccc}c_{1} & c_{2} & c_{3}\end{array}\right) = t\left(\begin{array}{ccc}1 & 1 & 1\end{array}\right)[/tex] where [tex]t \in[/tex] R

    let [tex]t = 1, \left(\begin{array}{ccc}c_{1} & c_{2} & c_{3}\end{array}\right) = \left(\begin{array}{ccc}1 & 1 & 1\end{array}\right)[/tex], so that
    [tex]\left(\begin{array}{ccc}2\lambda & -1 & -1\end{array}\right) + \left(\begin{array}{ccc}-1 & 2\lambda & -1\end{array}\right) + \left(\begin{array}{ccc}-1 & -1 & 2\lambda\end{array}\right) = \mathbf{0}[/tex]
    [tex]\left(\begin{array}{ccc}2\lambda & -1 & -1\end{array}\right) = - \left(\begin{array}{ccc}-1 & 2\lambda & -1\end{array}\right) - \left(\begin{array}{ccc}-1 & -1 & 2\lambda\end{array}\right)[/tex]

    And I'll leave out the -1/2 case because it takes ages to type out all that LaTeX...
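The dependence relation in the last line can be verified numerically, and the λ = -1/2 case left out above is immediate since all three vectors coincide. A numpy sketch:

```python
import numpy as np

# lam = 1: check the relation derived above, u = -v - w (i.e. u+v+w = 0)
lam = 1
u = np.array([2*lam, -1, -1])
v = np.array([-1, 2*lam, -1])
w = np.array([-1, -1, 2*lam])
print(u + v + w)  # expect [0 0 0]

# lam = -1/2: all three vectors equal (-1, -1, -1), so the set is
# trivially dependent (for example u - v = 0, i.e. u = v)
lam = -0.5
u2 = np.array([2*lam, -1, -1])
v2 = np.array([-1, 2*lam, -1])
print(u2 - v2)  # expect [0. 0. 0.]
```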
     