
Vector Subspace Proof

  1. Sep 25, 2009 #1
    Hey guys, I've got another problem I could use some assistance with.

    "In this problem we suppose that F is a field, A is an m by n matrix over
    F and that W is a subspace of Fm.
    (a) Show that U = {v [tex]\in[/tex] Fn: Av [tex]\in[/tex] W} is a subspace of Fn.
    (b) Now suppose that m = n and A is invertible, and that B = {v1, v2,...vk} is a basis for W. Show that {A-1v1, .... A-1vn} is a basis for U."

    For a) The way I'm understanding it is by looking at a 3x4 matrix. With such a matrix, v is a 4x1 vector, each product Av is a 3x1 vector, and U includes only those v whose product Av lands in W, i.e. the ones that are independent of each other. There will only be a maximum of n products that satisfy this, as the rank cannot exceed the lesser of m or n. Am I on the right track for completing the proof?

    As for b), I'm a little less clear. I'm thinking I'd multiply Av [tex]\in[/tex] W by A^{-1}, and since W is in turn composed of the basis (v_1 to v_k) I'd multiply each of those vectors by A^{-1} to form my basis for U.

    Any suggestions or comments to my reasoning and approach to the problems would be greatly appreciated.
     
  3. Sep 25, 2009 #2

    Office_Shredder

    To prove something is a subspace, you need to prove it's closed under two operations. Do you know how to do this?
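    In symbols (just a sketch of the criterion, using the U and F from the problem statement), the two closures to check for part (a) would be something like

    [tex]
    u_1, u_2 \in U \;\Rightarrow\; u_1 + u_2 \in U, \qquad u \in U,\ c \in F \;\Rightarrow\; cu \in U,
    [/tex]

    together with U being nonempty, e.g. by checking that the zero vector lies in U.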
     
  4. Sep 25, 2009 #3
    Yes, I'm well aware of that property. I'm just a little confused by the requisite condition placed on the subspace, and the differing use of both F^m and F^n in the problem's description.
     
  5. Sep 25, 2009 #4
    It's very easy. Think about the properties of linear transformations.
     
  6. Sep 25, 2009 #5
    We aren't actually covering linear transformations until next week, although the assignment is due on Monday. He always puts problems on the assignments that he doesn't cover until after they're due >_>. I thought I could solve the problem with just what I knew about inverses, vector spaces, and bases, but I guess I should do some reading about linear transformations then? Anything I should focus on?
     
  7. Sep 25, 2009 #6
    Well you can essentially consider a linear transformation to be a matrix, so what you know about matrices should do the trick: you know that A(x + y) = Ax + Ay and that A(cx) = c(Ax), where c is a scalar and x, y are vectors, right? Those properties define a linear transformation.
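    As a quick sanity check, those identities can be verified with any concrete numbers; for instance (a made-up 2x2 example, not taken from the problem):

    [tex]
    A = \begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix},\quad
    x = \begin{pmatrix} 1 \\ 1 \end{pmatrix},\quad
    y = \begin{pmatrix} 2 \\ 0 \end{pmatrix}
    \;\Rightarrow\;
    A(x + y) = \begin{pmatrix} 5 \\ 3 \end{pmatrix}
    = \begin{pmatrix} 3 \\ 3 \end{pmatrix} + \begin{pmatrix} 2 \\ 0 \end{pmatrix}
    = Ax + Ay.
    [/tex]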
     
  8. Sep 25, 2009 #7
    Ah, so I take it that in a), our matrix A is mapping each vector from F^n to F^m? So I'd just show that the vectors that can successfully be mapped form a subspace, by proving that v_1 + v_2 is still in the subspace of F^n (and can be mapped into F^m by A), and likewise for scalar multiplication? I feel like I'm overly complicating things :/
     
  9. Sep 25, 2009 #8
    Well you're overcomplicating things, because it's a given that since v_1 and v_2 lie in W, any linear combination of them does as well. Also, what would a vector that can't be successfully mapped look like? (hint: it would look like a unicorn or a teapot floating around the sun or something, because it doesn't exist!)
     
  10. Sep 25, 2009 #9
    Wait, isn't W in F^m, while v_1 and v_2 are in F^n? Aren't v_1 and v_2 only in W (and hence F^m) upon being mapped by A?
     
  11. Sep 25, 2009 #10
    Let's get our terminology straight and keep (a) different from (b) to keep things clear.

    Let v_1 and v_2 lie in W, which is in F^n. Instead of labeling their image in F^m, let's just write it as Av_1 and Av_2. Does that make sense? We know that v_1 and v_2 lie in W, and we also know that v_1 + v_2 lies in W along with any other linear combination of them. Plus we have the fact that A respects vector addition and scalar multiplication, as I explained in post #6. Do you think that you can organize those pieces of information into a proof of part (a)?
     
    Last edited: Sep 25, 2009
  12. Sep 25, 2009 #11
    Yeah, I think I can write it as such. The only thing I'm unclear about is why v_1 and v_2 lie in W, when the v's are in F^n while W is in F^m?
     
  13. Sep 25, 2009 #12
    My bad. I switched m and n. I'm used to doing it the other way. W lies in F^m.

    For the second one, the v_i all lie in W (and hence in F^m), so just write their image under A^{-1} (which lies in F^n; remember m = n here) as A^{-1}v_i, and that should get you started.

    If you want, post your proofs and I or somebody else can check them for you.
     
  14. Sep 26, 2009 #13
    So for the first part, because the v's lie in F^n and W is in F^m, the v's don't actually lie in W, correct? Instead the transformation under A brings them into W, so we'd have to show that Av_1 and Av_2 form a subspace?
     
  15. Sep 26, 2009 #14
    Yeah, that's almost right. You have to show that if v_1 gets mapped to W and v_2 gets mapped to W (and so both lie in U), then any linear combination of v_1 and v_2 gets mapped to W, because then that linear combination would lie in U as well. Use the properties of linearity of A as I outlined in post #6 and the subspace properties of W ;)
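    Spelled out, the one computation being suggested here (with scalars c_1, c_2 in F and v_1, v_2 in U, so that Av_1 and Av_2 lie in W) would be roughly

    [tex]
    A(c_1 v_1 + c_2 v_2) = c_1 A v_1 + c_2 A v_2 \in W,
    [/tex]

    where the last membership uses the fact that W is itself a subspace; that puts c_1 v_1 + c_2 v_2 back in U.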
     
  16. Sep 26, 2009 #15
    Here are my answers to the proofs. Could you let me know what you think?

    For a) I wrote "A is a matrix that maps a vector from F^n to F^m. Each v lies in F^n, and is then transformed by A to lie in W, which is in F^m. U is a subspace of F^n, and consists of all such vectors that are in F^n and that are mapped into W by A. This is a subspace because A(v_1 + v_2) = Av_1 + Av_2, which is still in W, and hence, v_1 + v_2 is still in U. Therefore it's closed under addition. To show that it's closed under scalar multiplication, we have A(cv_1) = c(Av_1), which is still mapped to W, and hence, cv_1 is still in U."

    b) Because A maps from F^n to F^m, or more precisely, from U to W, its inverse does the opposite. Because the vectors in W's basis are independent by definition, taking the inverse leaves them independent still.

    I'm just not sure how to show that it still spans all of U. Does it just follow from the definition?
     
  17. Sep 26, 2009 #16
    (a) looks good. (b) is just stating what you want to show without actually showing it.

    Also, in (b), since m = n, it's less confusing to just use n throughout.

    The first step in the proof is to take an arbitrary u in U. Then Au lies in W and so can be expressed as Au = c_1v_1 + ... + c_kv_k in terms of the basis B given in the statement of part (b).
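    From there, one way the spanning half could continue (just a sketch; linear independence still needs its own argument) is to apply A^{-1} to both sides:

    [tex]
    u = A^{-1}(Au) = A^{-1}(c_1 v_1 + \dots + c_k v_k) = c_1 A^{-1}v_1 + \dots + c_k A^{-1}v_k,
    [/tex]

    so every u in U is a linear combination of the A^{-1}v_i. For independence, a similar trick of applying A to a supposed dependence relation among the A^{-1}v_i should work.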

    A comment on part (a):

    Put in words, what part (a) says is that the preimage of a subspace of the target of a linear map/matrix is a subspace of the domain. A related statement, which is also true, is that the image of a subspace under a linear map is also a subspace. The reason for this is very simple: subspaces are defined in terms of vector addition and scalar multiplication. Because a linear transformation/matrix preserves these two operations, it also preserves subspaces. It would be a good exercise (very similar to (a)) to prove the second statement that I just gave you.
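    A sketch of that exercise (writing V for a subspace of the domain and A(V) for its image; these names are just for illustration): closure of the image follows from

    [tex]
    Av_1 + Av_2 = A(v_1 + v_2) \in A(V), \qquad c(Av_1) = A(cv_1) \in A(V),
    [/tex]

    because v_1 + v_2 and cv_1 lie back in V.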
     