Proving the Basis Property of S' for Rn Using Invertible Matrices

  • #1
Clandry
Show that if S = {v1, v2, ..., vn} is a basis for Rn and A is an n × n invertible matrix, then S' = {Av1, Av2, ..., Avn} is also a basis.

I need to show that:
1) Av1, Av2, ..., Avn are linearly independent
2) span(S') = Rn

I'm having some problems with this.
I see that S'=AS (duh)

This is what I'm thinking of doing:
c1*A*v1+...cn*A*vn=0 where c1,...cn are constants and must be zero
=A*c1*v1+...A*cn*vn=0 can I multiply by A^-1 to both sides? If so then:
=c1v1+...+cn*vn=0=span(S)
thus S' is a basis?
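As a quick numerical sanity check (not a proof), one can pick a concrete basis and an arbitrary invertible A and confirm that Av1, ..., Avn come out linearly independent; the matrix A below is a made-up example:

```python
import numpy as np

# Standard basis of R^3 as columns, and an arbitrary invertible A.
V = np.eye(3)                              # columns v1, v2, v3
A = np.array([[2., 1., 0.],
              [0., 1., 3.],
              [1., 0., 1.]])
assert abs(np.linalg.det(A)) > 1e-12      # A is invertible (det = 5)

S_prime = A @ V                            # columns are Av1, Av2, Av3

# c1*Av1 + ... + cn*Avn = 0 has only the zero solution exactly when
# the matrix [Av1 ... Avn] has full rank.
print(np.linalg.matrix_rank(S_prime))     # 3, so the Avi are independent
```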
 
  • #2
Clandry said:
c1*A*v1+...cn*A*vn=0 where c1,...cn are constants and must be zero
=A*c1*v1+...A*cn*vn=0 can I multiply by A^-1 to both sides? If so then:
=c1v1+...+cn*vn=0=span(S)
thus S' is a basis?
That's the sort of thing, but your wording is all wrong. What do you mean by "c1,...cn are constants and must be zero"?
Start with a stated supposition, like 'elements of S' are not linearly independent'. Therefore there exist, etc.
 
  • #3
haruspex said:
That's the sort of thing, but your wording is all wrong. What do you mean by "c1,...cn are constants and must be zero"?
Start with a stated supposition, like 'elements of S' are not linearly independent'. Therefore there exist, etc.

well if they form a basis then the linear combination:
c1v1+...+cnvn=0 can only have one solution for the constants, which is all zeros. Or do I have it all wrong?
 
  • #4
Clandry said:
well if they form a basis then the linear combination:
c1v1+...+cnvn=0 can only have one solution for the constants, which is all zeros.
That's correct, but it's not what you wrote before. It may be what you had in mind, but the reader cannot tell that.
Also, you actually wrote c1*A*v1+...cn*A*vn=0, so clarifying what you wrote before with the explanation you've now given, I get:
if the elements of S' form a basis then the linear combination c1*A*v1+...cn*A*vn=0 can only have one solution for the constants which is 0​
True, but not a useful place to start. We're trying to prove that they do form a basis. A logical chain starting with the supposition that they do might prove they don't, but it won't prove they do.
Try reductio ad absurdum: if they are not linearly independent then there exist...
 
  • #5
haruspex said:
That's correct, but it's not what you wrote before. It may be what you had in mind, but the reader cannot tell that.
Also, you actually wrote c1*A*v1+...cn*A*vn=0, so clarifying what you wrote before with the explanation you've now given, I get:
if the elements of S' form a basis then the linear combination c1*A*v1+...cn*A*vn=0 can only have one solution for the constants which is 0​
True, but not a useful place to start. We're trying to prove that they do form a basis. A logical chain starting with the supposition that they do might prove they don't, but it won't prove they do.
Try reductio ad absurdum: if they are not linearly independent then there exist...

So wouldn't this:

c1*A*v1+...cn*A*vn=0 where c1,...cn are constants and must be zero
=A*c1*v1+...A*cn*vn=0 can I multiply by A^-1 to both sides? If so then:
=c1v1+...+cn*vn=0=span(S)

show that the vectors are linearly independent equal to span(S)?
 
  • #6
Clandry said:
c1*A*v1+...cn*A*vn=0 where c1,...cn are constants and must be zero
I've already explained that is not a meaningful statement. I can only check your logic if you write it out properly.
 
  • #7
haruspex said:
I've already explained that is not a meaningful statement. I can only check your logic if you write it out properly.

How about:

c1*A*v1+...cn*A*vn=0
Rearranging: A*c1*v1+...A*cn*vn=0 can I multiply by A^-1 to both sides? If so then:
Multiplying A^-1 to both sides gives: c1v1+...+cn*vn=0=span(S) since S is defined as a basis, then there are constants c1...cn that must be 0 for ^ the above to be true because v1...vn are linearly independent.

This shows that S' is linearly independent because c1...cn for c1*A*v1+...cn*A*vn=0 is equal to 0. Also, span(S')=R^n. S' is a basis.
 
  • #8
You still don't get what I'm on about. Starting with "c1*A*v1+...cn*A*vn=0" means nothing. Try wrapping some words around it:
"If the elements of S' are not linearly independent then there exist..."
Later, yes, you can multiply by A^-1 and obtain c1v1+...+cnvn=0. But what is this "=span(S)"? That is certainly not true.
 
  • #9
haruspex said:
You still don't get what I'm on about. Starting with "c1*A*v1+...cn*A*vn=0" means nothing. Try wrapping some words around it:
"If the elements of S' are not linearly independent then there exist..."
Later, yes, you can multiply by A^-1 and obtain c1v1+...+cnvn=0. But what is this "=span(S)"? That is certainly not true.
Oops I forgot to put that, but here:

If S' is a basis, then S' must be linearly independent, thus if S' is linearly independent then the following needs to be satisfied:
c1*A*v1+...cn*A*vn=0 where c1=...=cn must be 0.
Rearranging: A*c1*v1+...A*cn*vn=0 can I multiply by A^-1 to both sides? If so then:
Multiplying A^-1 to both sides gives: c1v1+...+cn*vn=0
Since S is a basis, the vectors in S are linearly independent. Span(S)=c1v1+...+cn*vn, where c1...cn are unique.

This shows that S' is linearly independent because c1...cn for c1*A*v1+...cn*A*vn=0 is equal to 0. Also, span(S')=R^n. S' is a basis.
 
  • #10
No. If you want to prove some fact X, it is not useful to start with "if X". Start with "if not X". I keep giving you the kick-off point: ""If the elements of S' are not linearly independent then there exist..."
Carry on from there.
 
  • #11
Why is that even necessary? It's saying the exact same thing. In every proof in my book where we're asked to prove something, they prove the thing directly; they don't prove that the opposite is false.
I don't understand how that is any more "correct" than what I did.
 
  • #12
Clandry said:
Why is that even necessary? It's saying the exact same thing. In every proof in my book where we're asked to prove something, they prove the thing directly; they don't prove that the opposite is false.
I don't understand how that is any more "correct" than what I did.
In general, you have some given facts, X, and want to prove a fact Y.
You can start with X and manipulate them to arrive at Y, or, using the "reductio ad absurdum" approach, you start by assuming Y is false, then show this is inconsistent with X being true.
What you have tried to do is neither of these. You have started by assuming Y is true and seeing what you can deduce from there. You will never prove Y that way.
 
  • #13
That looks fine to me.
 
  • #14
Okay how about this:

Suppose c1Av1+...+cnAvn=0. This is of the form Av=0 where v=c1v1+...+cnvn.
Multiplying both sides by A^-1 gives v=0. Since v is a linear combination of the linearly independent vectors v1,...,vn, c1=...=cn=0. Thus Av1,...,Avn are linearly independent.
 
  • #15
Clandry said:
Okay how about this:

Suppose c1Av1+...+cnAvn=0. This is of the form Av=0 where v=c1v1+...+cnvn.
Multiplying both sides by A^-1 gives v=0. Since v is a linear combination of the linearly independent vectors v1,...,vn, c1=...=cn=0. Thus Av1,...,Avn are linearly independent.
Almost there! To make it clear, you really should start as I indicated:
If the elements of S' are not linearly independent then there exist c1,..,cn, not all zero, such that c1Av1+...+cnAvn=0. Then, using your steps above, you show that c1,..,cn, are all zero, so the supposition that the elements of S' are not linearly independent must have been false.
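The key algebraic step here — that c1Av1+...+cnAvn factors as A(c1v1+...+cnvn), so multiplying by A^-1 strips A off — can be illustrated numerically (a sketch with random data, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
V = rng.standard_normal((n, n))   # columns v1..vn (almost surely a basis)
A = rng.standard_normal((n, n))   # almost surely invertible
c = rng.standard_normal(n)        # arbitrary coefficients c1..cn

# c1*Av1 + ... + cn*Avn  ==  A @ (c1*v1 + ... + cn*vn)
lhs = (A @ V) @ c                 # columns Avi scaled by ci and summed
rhs = A @ (V @ c)                 # A applied to the combination of the vi
print(np.allclose(lhs, rhs))      # True: the factoring is valid
```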
Now, can you finish the problem by showing S' spans the space?
 
  • #16
haruspex said:
Almost there! To make it clear, you really should start as I indicated:
If the elements of S' are not linearly independent then there exist c1,..,cn, not all zero, such that c1Av1+...+cnAvn=0. Then, using your steps above, you show that c1,..,cn, are all zero, so the supposition that the elements of S' are not linearly independent must have been false.
Now, can you finish the problem by showing S' spans the space?

Hmmm, I completely forgot that I had to prove it spans the space as well.

I can use the same idea except the RHS is some vector b in Rn

c1Av1+...+cnAvn=b
Multiplying both sides by A^-1 gives:
c1v1+...+cnvn=A^-1*b (on a side note, I've been meaning to ask: when you multiply both sides by A^-1, why does A^-1 end up in front of the vector b and not behind it?)
since v1...vn form a basis, this is solvable and thus it spans Rn.
 
  • #17
Another question: since it's already proven to be linearly independent, doesn't that mean it's going to automatically span Rn? I can't think of a situation where it wouldn't when it's linearly independent.
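On the question above: yes — in Rn, n linearly independent vectors automatically span, because the n × n matrix with those vectors as columns has full rank and is therefore invertible, so Mc = b is solvable for every b. A small sketch with a made-up example:

```python
import numpy as np

# Three independent columns in R^3 form a full-rank square matrix,
# so M @ c = b has a (unique) solution for every b: independence
# forces spanning when the count matches the dimension.
M = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])
assert np.linalg.matrix_rank(M) == 3   # columns are independent

b = np.array([4., -1., 2.])            # an arbitrary target vector
c = np.linalg.solve(M, b)              # coefficients exist and are unique
print(np.allclose(M @ c, b))           # True: b is in the span
```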
 
  • #18
Clandry said:
c1Av1+...+cnAvn=b
Multiplying both sides by A^-1 gives:
c1v1+...+cnvn=A^-1*b
Sorry, but you have the argument backwards again. To prove they span the space, start by taking an arbitrary vector of the space then obtain an expression for the vector in the form of c1Av1+...+cnAvn. I.e. you will need to show how to compute the ci.
You need to break yourself of the habit of launching into equations without thinking about the structure of the proof. Here's a guide:
- Decide whether you are trying to prove that something exists/can be done or that something does not exist/cannot be done.
- If you are trying to prove that something exists, you will normally be looking for a way to construct it. (In the spanning proof needed here, a representation of a given vector using the proposed basis.) Non-constructive proofs also arise, and can be exceedingly elegant, but they're in the minority.
- If trying to prove something does not exist, it is usually easier to use RAA (reductio ad absurdum): start by supposing it does exist then obtain a contradiction. This was appropriate for the linear independence part: suppose there is a linear relationship and show that contradicts the givens.
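The constructive route described above — given an arbitrary b, actually compute the ci — can be sketched numerically; the basis, A, and b below are made-up examples:

```python
import numpy as np

V = np.array([[1., 1.],
              [0., 1.]])             # columns v1, v2: a basis of R^2
A = np.array([[0., 1.],
              [2., 0.]])             # invertible (det = -2)
b = np.array([3., 5.])               # arbitrary vector to represent

# Coefficients of b in the proposed basis {Av1, Av2}: solve (A V) c = b,
# which is the explicit construction c = V^-1 A^-1 b.
c = np.linalg.solve(A @ V, b)
print(np.allclose((A @ V) @ c, b))   # True: b = c1*Av1 + c2*Av2
```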
 

1. What is the basis property of S' for Rn?

The basis property of S' for Rn states that S' is a basis of Rn: every vector in Rn can be uniquely represented as a linear combination of the vectors in S'.

2. Why is it important to prove the basis property of S' for Rn?

Proving the basis property shows that the set S' can serve as a coordinate system for Rn, just as S does. It also allows us to use the vectors of S' to represent any vector in Rn.

3. What is the role of invertible matrices in proving the basis property?

Invertible matrices play a crucial role: because A has an inverse, the transformation taking S to S' can be undone, so linear independence and spanning are preserved when each basis vector is multiplied by A.
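As an illustration of this reversibility (a sketch with an arbitrary 2 × 2 example):

```python
import numpy as np

# An invertible A sends the standard basis e1, e2 to its own columns,
# and A^-1 undoes the transformation, recovering e1, e2. This
# reversibility is why multiplying a basis by A yields another basis.
A = np.array([[1., 2.],
              [3., 5.]])             # det = -1, so invertible
E = np.eye(2)                        # standard basis as columns
S_prime = A @ E                      # columns are Ae1, Ae2
back = np.linalg.inv(A) @ S_prime    # recovers the standard basis
print(np.allclose(back, E))          # True
```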

4. Can we use other methods to prove the basis property of S' for Rn?

Yes. One can check linear independence and spanning directly, but using the invertibility of A makes both checks short: multiplying by A^-1 reduces each statement about S' to the corresponding known statement about S.

5. How can we apply the basis property of S' for Rn in real-world situations?

The basis property is a fundamental concept in linear algebra and has many applications in various fields such as physics, engineering, and computer science. It can be used to solve systems of linear equations, analyze data in multidimensional spaces, and even in quantum mechanics.
