A couple questions for linear algebra final review

Homework Help Overview

The discussion revolves around proofs related to linear algebra concepts, specifically focusing on properties of linear independence and invertibility of matrices. Participants are exploring definitions and implications of these properties in a real vector space.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants discuss the definitions of linear independence and dependence, considering how subsets of vectors relate to the properties of the larger set. There is an exploration of using closure under scalar multiplication and addition as part of the proof process. Questions arise regarding the implications of matrix invertibility and the role of determinants in these proofs.

Discussion Status

Some participants have provided insights into the relationship between determinants and matrix invertibility, suggesting that this could simplify the proofs for parts c and d. Others are still working through the implications of their reasoning and seeking clarification on specific points.

Contextual Notes

Participants express a sense of urgency due to an upcoming final exam, which may influence the depth of their inquiries and the nature of the discussion. There is a recognition of the need for clarity in definitions and theorems related to linear algebra.

BiGyElLoWhAt

Homework Statement



2: Some proofs:
a) If ##\{ v_1 , v_2...v_n \} ## are linearly independent in a real vector space, so is any subset of them.

b) If any subset of the vectors ##\{ v_1 , v_2...v_n \} ## in a real vector space is linearly dependent, then the whole set of vectors is linearly dependent.

c) If ##A## is invertible, so is ##A^2##.

d) If ##A## is not invertible, neither is ##A^3##.

I think I either know everything else, or I can't really ask without asking to be walked through it, so I'll just have to spend some time on Google for a couple of them.

Any help is appreciated, my final's in 18 hours.

Homework Equations





The Attempt at a Solution



2:
a) I was thinking I could use closure under scalar multiplication and addition to prove this, but is that solid enough?

If ##\{ v_1 , v_2...v_n \} ## are linearly independent then by definition

##c_1v_1 + ... + c_{n-1}v_{n-1} ≠ c_nv_n##

(is subset ≈ subspace?)

then ##\{ v_1 , v_2...v_m \} ## with ##m \leq n## (not really sure how to notate that all the vectors of this set are contained within the first set), whose vectors are all members of the first set, is linearly independent.

This feels weak, though... It seems like common sense that if I have some linearly independent vectors and I throw some of them out, what's left is still linearly independent. I'm also sure I need to start from the definition of linear independence, which basically involves demonstrating closure under addition and scalar multiplication. I'm just not really sure how to put it technically. I think this is one of those "prove 2+2=4" things, and I'm just not seeing how to put it mathematically.

2:
b) Very similar work. I could copy and paste everything I have from up there and put it here, but once I figure out a), I'll get b).

2:
c) ##A^2 = AA##; multiply by ##A^{-1}##:
##AAA^{-1} = AI = A##
Proved.
I only put this in here because I'm not sure what to do about d). I know it's very similar, but is it simply ##A^3 = AAA##, and since ##A## is not invertible, ##A^3## cannot be reduced? How do I word that?
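As a numeric sanity check of c) (an illustration with an arbitrary invertible matrix I made up, not a proof): if ##A## is invertible, the candidate inverse of ##A^2## is ##(A^{-1})^2##, and multiplying confirms it.

```python
import numpy as np

# Sanity check (not a proof): if A is invertible, then
# B = A^{-1} A^{-1} satisfies (A^2) B = B (A^2) = I,
# so A^2 is invertible with inverse (A^{-1})^2.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # arbitrary invertible matrix (det = 1)
A_inv = np.linalg.inv(A)

B = A_inv @ A_inv
print(np.allclose(A @ A @ B, np.eye(2)))  # True
print(np.allclose(B @ A @ A, np.eye(2)))  # True
```

The same cancellation written symbolically, ##(AA)(A^{-1}A^{-1}) = A(AA^{-1})A^{-1} = AIA^{-1} = I##, is the proof the check mimics.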
 

For a) ##\{ v_1 , v_2...v_n \} ## are linearly independent when the vectors satisfy ##c_1v_1 + ... + c_nv_n = 0 \iff c_1=...=c_n=0##.

Suppose you take a smaller subset of the vectors, ##\{ v_1 , v_2...v_{n-1} \}##. Then these vectors are linearly independent when ##c_1v_1 + ... + c_{n-1}v_{n-1} = 0 \iff c_1 = ... = c_{n-1} = 0##.

Using the fact that ##c_1v_1 + ... + c_{n-1}v_{n-1} = -c_nv_n##, you can say that the smaller subset of vectors is linearly independent if ##c_nv_n = 0 \iff c_n = 0##.

It must be the case that ##c_n = 0##, otherwise you would have a contradiction. Why?
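The claim in a) can also be illustrated numerically (with example vectors I chose for illustration, not from the thread): a set of columns is linearly independent exactly when the matrix they form has full column rank, and dropping columns from an independent set still leaves full column rank.

```python
import numpy as np

# Three linearly independent vectors in R^3 (assumed example).
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 1.0])

full = np.column_stack([v1, v2, v3])    # all three vectors
subset = np.column_stack([v1, v3])      # throw one out

# Full column rank <=> the columns are linearly independent.
print(np.linalg.matrix_rank(full))      # 3
print(np.linalg.matrix_rank(subset))    # 2
```

This only illustrates the statement for one example; the proof itself is the coefficient argument above.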

For parts c) and d), what about the determinant?
 
Aha! Thank you, I didn't even think about the determinant. But is the only situation where ##A## is non-invertible when ##\det(A)=0##? If so, that makes c) and d) very simple, since ##\det(A^n) = \det(A)^n##.
 

If ##A## is invertible (non-singular), ##\det(A) \neq 0##.

If ##A## is not invertible (singular), ##\det(A) = 0##.

It makes c) and d) quite straightforward.
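A quick numeric check of the determinant route (with matrices I picked as assumed examples): ##\det(AB) = \det(A)\det(B)##, so ##\det(A^n) = \det(A)^n##, and a singular ##A## forces ##\det(A^3) = 0##.

```python
import numpy as np

# Singular example: rows are proportional, so det = 0.
A_sing = np.array([[1.0, 2.0],
                   [2.0, 4.0]])

# Invertible example: det = -1.
A_ok = np.array([[0.0, 1.0],
                 [1.0, 1.0]])

# det(A^3) = det(A)^3 = 0 when A is singular, so A^3 is singular too.
print(np.isclose(np.linalg.det(A_sing @ A_sing @ A_sing), 0.0))  # True

# det(A^2) = det(A)^2 for the invertible example.
print(np.isclose(np.linalg.det(A_ok @ A_ok),
                 np.linalg.det(A_ok) ** 2))                      # True
```

That is the whole content of c) and d): ##\det(A)^2 \neq 0## whenever ##\det(A) \neq 0##, and ##\det(A)^3 = 0## whenever ##\det(A) = 0##.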
 
Awesome, thanks a million.
 
