Vector Space Proof: Proving Set S is a Basis

In summary, the conversation discusses proving that a set of vectors S is a basis for a vector space V if it satisfies a certain property. The property states that for any vector space W and any function f mapping S to W, there exists a unique linear transformation T mapping V to W such that T restricted to S is equal to f. The conversation also explores different approaches to proving this statement by considering cases where S is not linearly independent or does not span V.
  • #1
Carl140

Homework Statement

Let V be a vector space over a field K, and let S = {e_i : i in J} be a set of vectors of V (i.e., e_i in V for each i in an index set J). Prove that if S satisfies the following property, then S must be a basis of V.

The property: for every vector space W over K and every function f: S -> W, there exists a unique linear transformation T: V -> W such that T restricted to S equals f.


The Attempt at a Solution

I really have no clue how to start this. Of course, I must prove that S is linearly independent and that it spans V.
 
  • #2
Try a proof by contradiction. If the set isn't linearly independent, find a function f for which no such T exists. If the set doesn't span V, find a function f for which more than one such T exists.

EDIT: Intuition-wise, you should know that once you describe a linear function on a basis, you have described it entirely and uniquely. This problem shows the converse: if you add extra elements (destroying linear independence), you have too much information and there need not be any linear function satisfying it all; if you remove elements (destroying spanning), you don't have enough information to pin down a unique T.
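The intuition in this reply can be checked numerically: once the images of the basis vectors are fixed, the matrix of T (with those images as columns) is forced. A minimal numpy sketch, with made-up basis and image vectors:

```python
import numpy as np

# A basis of R^2 and chosen images in R^3 (illustrative values).
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
w1, w2 = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])

# The unique linear T with T(e1) = w1, T(e2) = w2 has matrix [w1 | w2].
T = np.column_stack([w1, w2])

# Any vector v = a*e1 + b*e2 is then forced to map to a*w1 + b*w2.
v = 2.0 * e1 - 1.0 * e2
assert np.allclose(T @ v, 2.0 * w1 - 1.0 * w2)
```

Because {e1, e2} is a basis, no other linear map agrees with these two assignments, which is exactly the existence-and-uniqueness in the problem statement.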
 
  • #3
Hi, thanks for your reply, but I don't see it yet. Let's see.
Suppose the set isn't linearly independent. Then there exists some e_k in S such that e_k is a linear combination of the other elements of S, i.e.,
e_k = sum( r_i * e_i ) over i ≠ k, where the r_i are elements of the field K.
Now I have to define a function f: S -> W such that no such T exists. My problem is that W is an arbitrary vector space over K, so how can I send elements of S to W if we have no information at all about W? Could you please give a hint for this?
 
  • #4
Pick W to be... say, K (1-dimensional). W is arbitrary, so you're allowed to pick a specific W.
 
  • #5
Can we at least assume that "for every function f: S-> W" is actually "for every linear function f:S->W"? If not then the statement is not true.'

If S is NOT a basis the either
1) it does not span V or
2) it is not independent.

Given 1, construct a linear function f:S-> W so that it there is NO linear transfromation L:V-> W such that L restricted to S is f.

Given 2, construct a linear function f:S->W so that there are two different linear transformations L1:V->W and L2: V->W so that L1 and L2 restricted to S are both equal to f.
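Both constructions can be seen concretely in a low-dimensional case, taking W = K = R as suggested in #4 (the specific vectors below are made-up illustrations):

```python
import numpy as np

# Not independent: S = {(1,0), (2,0)} in R^2, since e2 = 2*e1.
# Pick f(e1) = 0, f(e2) = 1.  Any linear T must satisfy
# T(e2) = T(2*e1) = 2*T(e1) = 0, contradicting f(e2) = 1,
# so no linear T restricts to f on S.
f_e1, f_e2 = 0.0, 1.0
assert 2.0 * f_e1 != f_e2  # the linearity constraint is violated

# Not spanning: S = {(1,0)} in R^2.  Two different linear maps
# T1, T2 : R^2 -> R (written as 1x2 matrices) can agree on S:
T1 = np.array([[3.0, 0.0]])
T2 = np.array([[3.0, 7.0]])
e1 = np.array([1.0, 0.0])
assert np.allclose(T1 @ e1, T2 @ e1)                       # equal on S
assert not np.allclose(T1 @ [0.0, 1.0], T2 @ [0.0, 1.0])   # yet T1 != T2
```

So failure of independence kills existence of T, and failure of spanning kills uniqueness, matching the hint in #2.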
 
  • #6
HallsofIvy said:
Can we at least assume that "for every function f: S -> W" is actually "for every linear function f: S -> W"? If not, then the statement is not true.

If S is NOT a basis, then either
1) it does not span V, or
2) it is not linearly independent.

Given 1, construct a function f: S -> W so that there are TWO different linear transformations L1: V -> W and L2: V -> W with L1 and L2 restricted to S both equal to f.

Given 2, construct a function f: S -> W so that there is NO linear transformation L: V -> W such that L restricted to S is f.

Here's what I did:
Suppose S is not linearly independent. Then we can write some e_k as
e_k = sum( a_r * e_r ), with r in J (r ≠ k) and a_r in K.
Now define f: S -> K by f(e_i) = sum( abs(a_r), r in J ), where abs(a_r) means absolute value.
Then assume there exists a linear transformation T such that T(s) = f(s) for all s in S. In particular, for s = e_k we get
T(e_k) = f(e_k), so T(sum( a_r * e_r )) = sum( abs(a_r), r in J ), and using linearity we
end up with sum( abs(a_r) * [T(e_r) - 1] ) = 0, so T(e_r) = 1... and then? Is this the right approach?
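For comparison, the no-T construction is usually done without absolute values (which need not even make sense over an arbitrary field K): send the dependent vector to 1 and every other vector to 0. A sketch of this standard construction, not necessarily the one intended above:

```latex
Suppose $e_k = \sum_{i \neq k} a_i e_i$. Take $W = K$ and define
$f : S \to K$ by $f(e_k) = 1$ and $f(e_i) = 0$ for $i \neq k$.
If a linear $T : V \to K$ restricted to $S$ equals $f$, then
\[
  1 = f(e_k) = T(e_k) = T\Big(\sum_{i \neq k} a_i e_i\Big)
    = \sum_{i \neq k} a_i\, T(e_i) = \sum_{i \neq k} a_i \cdot 0 = 0,
\]
a contradiction, so no such $T$ exists.
```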
 

1. What is a vector space?

A vector space is a mathematical structure consisting of a set of vectors together with two operations, addition and scalar multiplication, that satisfy certain axioms. These include closure, associativity and commutativity of addition, distributivity, the existence of an additive identity and additive inverses, and compatibility of scalar multiplication with the field operations.

2. What does it mean for a set S to be a basis of a vector space?

A set S is considered a basis of a vector space if it meets two criteria: (1) the vectors in S are linearly independent, meaning that no vector in S can be written as a linear combination of the other vectors in S, and (2) the vectors in S span the entire vector space, meaning that every vector in the vector space can be written as a linear combination of the vectors in S.
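For a finite-dimensional example over the reals, both criteria reduce to a single rank computation: n vectors in R^n form a basis exactly when the matrix with those vectors as columns has rank n (equivalently, nonzero determinant). A sketch with made-up vectors:

```python
import numpy as np

def is_basis(vectors):
    """Check whether a list of n vectors is a basis of R^n.

    Full column rank gives linear independence, and n independent
    vectors in R^n automatically span, so rank n settles both criteria.
    """
    A = np.column_stack(vectors)
    n = A.shape[0]
    return len(vectors) == n and np.linalg.matrix_rank(A) == n

assert is_basis([np.array([1.0, 1.0]), np.array([1.0, -1.0])])
assert not is_basis([np.array([1.0, 2.0]), np.array([2.0, 4.0])])  # dependent
```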

3. How do you prove that a set S is a basis of a vector space?

To prove that a set S is a basis of a vector space, you must show two things. First, show that the vectors in S are linearly independent: assume some linear combination of them equals the zero vector and show that all the coefficients must be zero. Second, show that S spans the vector space: show that every vector in the space can be written as a linear combination of vectors in S. Together these two conditions make S a basis; equivalently, a basis is a minimal spanning set and a maximal independent set.
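The independence step ("the coefficients must all be zero") can be phrased computationally: the columns of A are independent exactly when A c = 0 has only the trivial solution, i.e. when A has no zero singular value. A sketch with illustrative vectors:

```python
import numpy as np

# Two vectors in R^3, stacked as the columns of A (illustrative values).
A = np.column_stack([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])

# The columns are independent iff every singular value is nonzero,
# i.e. the null space of A is trivial and A c = 0 forces c = 0.
singular_values = np.linalg.svd(A, compute_uv=False)
assert np.all(singular_values > 1e-12)  # columns are independent
```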

4. What is the importance of a basis in vector spaces?

Bases are important in vector spaces because they provide a way to represent any vector in the vector space using a linear combination of a finite set of vectors. This makes it easier to perform calculations and analyze properties of vectors in the vector space. Bases also help to define the dimension of a vector space, which is a key concept in linear algebra.
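Expressing a vector in a given basis amounts to solving a linear system: if B has the basis vectors as its columns, the coordinate vector c of v satisfies B c = v. A small sketch with illustrative vectors:

```python
import numpy as np

B = np.column_stack([[1.0, 1.0], [1.0, -1.0]])  # basis vectors as columns
v = np.array([3.0, 1.0])

c = np.linalg.solve(B, v)        # coordinates of v in this basis: [2., 1.]
assert np.allclose(B @ c, v)     # reconstruct v from its coordinates
```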

5. Can a vector space have more than one basis?

Yes, a vector space can have multiple bases. In fact, any vector space of dimension at least one over an infinite field (such as the real numbers) has infinitely many bases, because there are many different ways to choose a linearly independent set that spans the whole space. However, all bases of a given vector space contain the same number of vectors, and this common number is the dimension of the vector space.
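As a quick illustration, here are two different bases of R^2 (both sets are made-up examples); each has full rank, and both have the same size, the dimension 2:

```python
import numpy as np

std   = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]  # standard basis
other = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]  # a different basis

for basis in (std, other):
    # Full rank means independent and spanning, hence a basis of R^2.
    assert np.linalg.matrix_rank(np.column_stack(basis)) == 2
    assert len(basis) == 2  # every basis of R^2 has exactly 2 vectors
```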
