
More Proofs (2/3 I think I've solved)

  • Thread starter gutnedawg
  • #1
1. Let A be an m × n real matrix of rank r whose m rows are all pairwise distinct. Let
s ≤ m, and let B be an s × n matrix obtained by choosing s distinct rows of A. Prove
that

rank(B) ≥ r + s − m.

Solution:
Assume that s is the largest number of distinct rows of A.

r = n - dim Nul A
dim Nul A = m - s
r = n - m + s
rank(B) ≥ n - m + s + s - m
rank(B) ≥ n + (-2m + 2s)
rank(B) = n - dim Nul B
dim Nul B = 0, since all s rows are linearly independent
rank(B) = n
n ≥ n - 2m + 2s
0 ≥ -2m + 2s
2m ≥ 2s
m ≥ s (dividing by 2)
Since m ≥ s is given, doesn't this conclude the proof? Or should I plug in the 0, so that

rank(B) ≥ n + x where 0 ≥ x
n ≥ n + x
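As a quick numerical sanity check of the claimed bound (not a proof, and the matrix and row choice below are just illustrative assumptions of mine), you can test rank(B) ≥ r + s − m with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative example: a 5 x 4 real matrix, so m = 5, n = 4
A = rng.integers(-3, 4, size=(5, 4)).astype(float)
m, n = A.shape
r = np.linalg.matrix_rank(A)

# Form B from s = 3 chosen rows of A
rows = [0, 2, 4]
s = len(rows)
B = A[rows, :]

# The claimed bound: rank(B) >= r + s - m
assert np.linalg.matrix_rank(B) >= r + s - m
print(np.linalg.matrix_rank(B), ">=", r + s - m)
```

The assertion holds for any choice of rows here, which is consistent with the inequality being a theorem rather than something depending on the particular rows picked.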

2. Let V be a vector space, let p ≤ m, and let b1, . . . , bm be vectors in V such that
A = {b1, . . . , bp} is a linearly independent set, while C = {b1, . . . , bm} is a spanning set
for V . Prove that there exists a basis B for V such that A ⊆ B ⊆ C.

Solution:
I'm going on the fact that the problem does not say C is linearly independent. Thus, by the spanning set theorem, there exists a linearly independent subset {bi,...,bk} of C which spans V. This set {bi,...,bk} is therefore a basis for V.

This means that the basis must contain at least A, since B cannot be a basis for V if there is another linearly independent vector bp. Meaning:

[tex] A \subseteq B [/tex]

Also, since B is a spanning set of V and comprises at least {b1,...,bp}, it must be a subset of C, since C also spans V and includes A.

Thus

[tex] A \subseteq B \subseteq C [/tex]
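The standard "sifting" argument behind this problem can be sketched numerically: start from A and greedily keep only those vectors of C that raise the rank. This is my own illustration (the vectors and the helper `extend_to_basis` are assumptions, not from the problem), using NumPy rank checks as a stand-in for linear independence:

```python
import numpy as np

def extend_to_basis(A_vecs, C_vecs):
    """Greedily grow the independent set A_vecs into a basis,
    using only vectors drawn from the spanning set C_vecs."""
    basis = list(A_vecs)
    for v in C_vecs:
        candidate = np.array(basis + [v])
        # Keep v only if it is independent of the vectors chosen so far
        if np.linalg.matrix_rank(candidate) > len(basis):
            basis.append(v)
    return basis

# Illustrative example in R^3 (vectors chosen for demonstration only)
A_vecs = [np.array([1.0, 0.0, 0.0])]              # independent set A
C_vecs = A_vecs + [np.array([2.0, 0.0, 0.0]),     # dependent on A, gets sifted out
                   np.array([0.0, 1.0, 0.0]),
                   np.array([0.0, 1.0, 1.0])]     # together, C spans R^3
B = extend_to_basis(A_vecs, C_vecs)
print(len(B))  # 3: a basis of R^3 with A ⊆ B ⊆ C
```

Since every kept vector comes from C and the initial vectors are all of A, the result B satisfies A ⊆ B ⊆ C by construction, mirroring the containments in the proof.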

3. Let {v1, v2, . . . , vm} be an indexed linearly dependent set of vectors in a vector space V such that v1 is not 0_V. Prove that there exists exactly one index 2 ≤ i ≤ m with the property that the vector vi can be expressed as a linear combination of the preceding vectors v1, . . . , v(i-1) in a unique way.
 

Answers and Replies

  • #2
For #3:

Suppose you found j, the lowest index i such that v_i can be written as a unique linear combination of {v_1, v_2, ..., v_(i-1)}. Could you then find another i > j such that v_i can be written as a unique linear combination of {v_1, v_2, ..., v_(i-1)}?

Do you know why j has to exist?
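To see the hint concretely, here is a small numerical illustration (the vectors and the helper `unique_combination_indices` are my own assumptions, not part of the problem). It uses the fact that the expression of v_i in terms of its predecessors is unique exactly when those predecessors are linearly independent:

```python
import numpy as np

def unique_combination_indices(vecs):
    """Return the 1-based indices i >= 2 such that vecs[i-1] lies in the
    span of its predecessors AND that expression is unique (which holds
    exactly when the predecessors are linearly independent)."""
    hits = []
    for i in range(2, len(vecs) + 1):
        pred = np.array(vecs[:i - 1])
        with_v = np.array(vecs[:i])
        in_span = np.linalg.matrix_rank(with_v) == np.linalg.matrix_rank(pred)
        unique = np.linalg.matrix_rank(pred) == i - 1  # predecessors independent
        if in_span and unique:
            hits.append(i)
    return hits

# Illustrative dependent set in R^3 with v1 != 0 (chosen for demonstration)
vecs = [np.array([1.0, 0.0, 0.0]),
        np.array([0.0, 1.0, 0.0]),
        np.array([1.0, 1.0, 0.0]),   # v3 = v1 + v2: in the span, uniquely
        np.array([2.0, 2.0, 0.0])]   # also in the span, but not uniquely
print(unique_combination_indices(vecs))  # [3]
```

For i = 4 the predecessors already contain the dependence v3 = v1 + v2, so v4 has many representations; only the lowest such index j gives a unique one, which is the point of the hint.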
 
