Linear Algebra Proofs for n×n Matrices | Homework Assistance


Homework Help Overview

The original poster is seeking assistance with three proofs related to linear algebra, specifically concerning n×n matrices. The first two proofs involve the relationship between the spanning of $\mathbb{R}^n$ and linear independence of the columns of a matrix A, while the third proof pertains to eigenvalues and determinants.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the implications of the column space of A equating to Rn and its effects on rank and nullity, as well as the relationship between linear independence and spanning. They also explore the definition of eigenvalues in relation to determinants and the existence of non-zero vectors.

Discussion Status

Some participants have provided alternative perspectives on the original poster's reasoning, suggesting ways to formalize their thoughts and clarifying the implications of linear independence and spanning. There appears to be productive engagement with the concepts, though no consensus has been reached.

Contextual Notes

The original poster is constrained by the requirement to prove the first two statements without referencing the invertible matrix theorem, which adds complexity to their attempts.

rhyno89
Messages: 18 · Reaction score: 0

Homework Statement


Ok so I am stuck on three proofs for my linear algebra final, and help on any or all of them would really help with my studying.

For the first two, assume that A is an n×n matrix.

1. If the columns of A span $\mathbb{R}^n$, then the homogeneous system $Ax = 0$ has only the trivial solution.
2. If the columns of A are linearly independent, then the columns of A span $\mathbb{R}^n$.

These two have to be proved without referencing other parts of the invertible matrix theorem.

And then:

3. Be able to prove: if A is an n×n matrix, then $\lambda$ is an eigenvalue of A if and only if $\det(A - \lambda I_n) = 0$.


Homework Equations





The Attempt at a Solution



I have a better idea about the first two than the third. Since A is n×n, I know that if its columns span $\mathbb{R}^n$ then it has a pivot in every row, and consequently a pivot in every column. The same fact can be used to explain number 2, with the columns being linearly independent. My problem with these two is that I am having trouble because I can't reference them as being part of the invertible matrix theorem.
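The pivot-counting idea above can be checked numerically: row-reduce and count pivots, and note that for a square matrix a pivot in every row is the same as a pivot in every column. A rough Python sketch with exact rational arithmetic (the example matrices are my own illustrations, not from the problem):

```python
from fractions import Fraction

def pivot_count(rows):
    """Count pivots via Gaussian elimination with exact rational arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    n_rows, n_cols = len(m), len(m[0])
    pivots = 0
    col = 0
    for r in range(n_rows):
        # find a row at or below r with a nonzero entry in the current column
        while col < n_cols:
            pivot_row = next((i for i in range(r, n_rows) if m[i][col] != 0), None)
            if pivot_row is not None:
                break
            col += 1
        if col == n_cols:
            break
        m[r], m[pivot_row] = m[pivot_row], m[r]
        # eliminate entries below the pivot
        for i in range(r + 1, n_rows):
            factor = m[i][col] / m[r][col]
            m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        pivots += 1
        col += 1
    return pivots

# Invertible 3x3: a pivot in every row, so the columns span R^3
A = [[2, 1, 0], [1, 2, 1], [0, 1, 2]]
assert pivot_count(A) == 3

# Rank-deficient 3x3: only 2 pivots, so the columns do NOT span R^3
B = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
assert pivot_count(B) == 2
```

Because `pivot_count` equals the rank, it is n exactly when the columns span $\mathbb{R}^n$, and (by rank-nullity) exactly when $Ax = 0$ has only the trivial solution.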

For the third one I think I am on the right track, but I'm not sure.
The determinant of $A - \lambda I$ must equal zero because the determinant of A is simply the product of the eigenvalues. If you substitute each eigenvalue into the determinant one at a time and multiply it by I, one of the factors is replaced by zero, and anything multiplied by zero results in zero. As a result, the determinant must equal zero.

Any help would be great
 
A somewhat different take on 1 and 2 than yours (I suspect yours could be formalized as well, but I find the following approach more natural):

For 1: In other words, you are told that the column space of A equals $\mathbb{R}^n$. Then what are the rank and nullity of A? Knowing the nullity, what can you tell about the space of solutions?

For 2: In other words, the n columns are linearly independent in the column space, whose dimension is at most n. Can n linearly independent vectors possibly fail to span an n-dimensional space?
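The hint for 2 can be made concrete in low dimension: for n = 2, two independent columns let you solve for the coordinates of any target vector, e.g. via Cramer's rule. A minimal Python sketch (the vectors here are made-up examples, not from the thread):

```python
from fractions import Fraction

def det2(a, b, c, d):
    # determinant of the 2x2 matrix [[a, b], [c, d]]
    return a * d - b * c

def coords_in_basis(c1, c2, b):
    """Express b as x*c1 + y*c2 via Cramer's rule; valid when c1, c2 are independent."""
    D = det2(c1[0], c2[0], c1[1], c2[1])
    assert D != 0, "columns are linearly dependent"
    x = Fraction(det2(b[0], c2[0], b[1], c2[1]), D)
    y = Fraction(det2(c1[0], b[0], c1[1], b[1]), D)
    return x, y

c1, c2 = (1, 1), (1, -1)          # linearly independent in R^2
x, y = coords_in_basis(c1, c2, (3, 1))
# reconstruct the target from the coefficients: x*c1 + y*c2 == (3, 1)
assert (x * c1[0] + y * c2[0], x * c1[1] + y * c2[1]) == (3, 1)
```

Since every vector b is reachable this way when the determinant is nonzero, the two independent columns do span the whole space, which is the point of the hint.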

For 3: For \lambda to be an eigenvalue means that there exists a non-zero vector v such that,
Av = \lambda v = \lambda I_n v
which is equivalent to:
(A-\lambda I_n)v = 0
If A-\lambda I_n has an inverse B, try to left-multiply the above equality by B.

For the other direction assume \det(A-\lambda I_n)=0 which is equivalent to A-\lambda I_n not having an inverse, or having nullity >0. Since the nullity is >0 the null space contains a non-zero vector v which you can show is a eigenvector with eigenvalue \lambda.
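Both directions of this hint can be illustrated on a small concrete matrix: $\det(A - \lambda I_2)$ vanishes exactly at the eigenvalues, and each root comes with a non-zero null-space vector. A quick Python check (the matrix is an illustrative example, not from the thread):

```python
def det2(m):
    # determinant of a 2x2 matrix given as nested lists
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def a_minus_lambda_i(A, lam):
    # A - lam * I for a 2x2 matrix
    return [[A[0][0] - lam, A[0][1]],
            [A[1][0], A[1][1] - lam]]

A = [[2, 1], [1, 2]]   # symmetric, eigenvalues 1 and 3

# det(A - lambda*I) vanishes exactly at the eigenvalues:
assert det2(a_minus_lambda_i(A, 1)) == 0
assert det2(a_minus_lambda_i(A, 3)) == 0
assert det2(a_minus_lambda_i(A, 2)) != 0   # 2 is not an eigenvalue

# and for lambda = 3, v = (1, 1) is a non-zero null-space vector of A - 3I,
# i.e. an eigenvector: A v = 3 v
v = (1, 1)
Av = (A[0][0] * v[0] + A[0][1] * v[1],
      A[1][0] * v[0] + A[1][1] * v[1])
assert Av == (3 * v[0], 3 * v[1])
```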
 
Ok, I think I can take it from here. Thanks a lot, you really saved me with this.
 
In (1) you have n vectors that span an n-dimensional space.

In (2) you have n vectors in an n-dimensional space.

Do you recall that a basis for a space has three properties, but that any two are enough to prove the third?
 
