Linear dependence and inner product space

In summary, Theorem 6.3 states that if S = {v_1, ..., v_k} is an orthogonal set of nonzero vectors and [tex]y = \sum^k_{i=1} a_i v_i[/tex], then [tex]a_j = \langle y, v_j \rangle / ||v_j||^2[/tex] for all j.
  • #1
Defennder
Homework Helper

Homework Statement


The following is from the book Linear Algebra 3rd Edn by Stephen Friedberg, et al:
pg 327 said:
Let V be an inner product space, and let S be an orthogonal set of nonzero vectors. Then S is linearly independent. Proof:

Suppose that [tex]v_1, \dots, v_k \in S[/tex] and [tex]\sum_{i=1}^k a_i v_i = 0[/tex].

By Theorem 6.3, [tex]a_j = \langle 0, v_j \rangle / ||v_j||^2 = 0[/tex] for all j. So S is linearly independent.

Here the a_j are scalars from the field F and the v_j are vectors in the inner product space V.

Homework Equations


Theorem 6.3:
Let V be an inner product space, and let [tex]S = \{v_1, \dots, v_k\}[/tex] be an orthogonal set of nonzero vectors. If [tex]y = \sum^k_{i=1} a_i v_i[/tex], then [tex]a_j = \langle y, v_j \rangle / ||v_j||^2[/tex] for all j.
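
(For reference, the formula in Theorem 6.3 follows in one line from orthogonality: take the inner product of both sides of [tex]y = \sum^k_{i=1} a_i v_i[/tex] with [tex]v_j[/tex], and every cross term vanishes.)

[tex]\langle y, v_j \rangle = \left\langle \sum^k_{i=1} a_i v_i, v_j \right\rangle = \sum^k_{i=1} a_i \langle v_i, v_j \rangle = a_j \langle v_j, v_j \rangle = a_j ||v_j||^2[/tex]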

The Attempt at a Solution


Now I don't understand why Theorem 6.3 implies [tex]a_j = \langle 0, v_j \rangle / ||v_j||^2 = 0[/tex] for all j. I can see how this is zero if the numerator [tex]\langle 0, v_j \rangle = 0[/tex], but the inner product isn't even defined yet and I don't see anywhere in the axioms that the inner product of the zero vector and an orthogonal vector would always be zero. So how does Theorem 6.3 apply here?
 
  • #2
CompuChip
Defennder said:
[...] the inner product isn't even defined yet
I find that hard to believe, since I don't see how you can talk about an inner product space without having defined an inner product :)
So I guess you should go back a bit and find that definition and you will see that it immediately follows from the properties of an inner product that <v, 0> = 0 for all v. For example:
  • Let x be any vector, then <v, 0> = <v, 0*x> (see definition of vector space) = 0 * <v, x> (by linearity of the inner product) = 0 (by multiplicative properties of the reals :smile:)
  • <v, 0> = <v, v - v> (by definition of the vector space) = <v, v> + <v, -v> (by additivity of the inner product) = <v, v> - <v, v> = 0. Of course, technically, v - v = v + (-v) with the existence of -v asserted by the definition of vector space, etc.
 
  • #3
Defennder
CompuChip said:
I find that hard to believe, since I don't see how you can talk about an inner product space without having defined an inner product :)
Well I meant the specific inner product function, not the general notion of an inner product on a vector space.
  • Let x be any vector, then <v, 0> = <v, 0*x> (see definition of vector space) = 0 * <v, x> (by linearity of the inner product) = 0 (by multiplicative properties of the reals :smile:)
This is just what I need. Thanks!
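
Both facts above — [tex]\langle 0, v \rangle = 0[/tex] and the coefficient formula of Theorem 6.3 — are easy to sanity-check numerically. Here is a minimal sketch in Python with numpy (the orthogonal set and the coefficients are arbitrary examples chosen for illustration):

[code]
import numpy as np

# An orthogonal set of nonzero vectors in R^3 (pairwise dot products are zero).
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])
vs = [v1, v2, v3]

# <0, v> = 0 for every v.
zero = np.zeros(3)
print([float(np.dot(zero, v)) for v in vs])              # [0.0, 0.0, 0.0]

# Theorem 6.3: if y = sum_i a_i v_i, then a_j = <y, v_j> / ||v_j||^2.
a = [2.0, -3.0, 0.5]                                     # arbitrary coefficients
y = sum(ai * vi for ai, vi in zip(a, vs))
print([float(np.dot(y, v) / np.dot(v, v)) for v in vs])  # [2.0, -3.0, 0.5]
[/code]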
 

1. What is linear dependence?

Linear dependence is a concept in linear algebra describing redundancy in a set of vectors: a set is linearly dependent if at least one of its vectors can be written as a linear combination of the others, or equivalently, if some nontrivial linear combination of the vectors equals the zero vector. For example, in R^2 the set {(1, 2), (2, 4)} is linearly dependent because (2, 4) = 2(1, 2).

2. What is an inner product space?

An inner product space is a vector space equipped with an inner product: a function that takes two vectors and produces a scalar. It generalizes the familiar dot product on R^n and is also defined on more abstract spaces, such as complex vector spaces and function spaces.
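
Concretely, an inner product [tex]\langle \cdot, \cdot \rangle[/tex] on a vector space V over F (where F is the real or complex numbers) must satisfy, for all x, y, z in V and all c in F (stated here with Friedberg's convention of linearity in the first component):

[tex]\langle x + z, y \rangle = \langle x, y \rangle + \langle z, y \rangle, \qquad \langle cx, y \rangle = c \langle x, y \rangle, \qquad \langle x, y \rangle = \overline{\langle y, x \rangle}, \qquad \langle x, x \rangle > 0 \text{ if } x \neq 0[/tex]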

3. How do you determine if a set of vectors is linearly independent?

A set of vectors is linearly independent if none of the vectors can be written as a linear combination of the others; equivalently, the only linear combination of the vectors that equals the zero vector is the one in which every coefficient is zero. For n vectors in an n-dimensional space, one test is to form the matrix whose columns are the vectors and compute its determinant: the vectors are linearly dependent exactly when the determinant is zero.
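
As a sketch of this test in Python with numpy (the vectors are arbitrary examples), the determinant handles the square case and numpy.linalg.matrix_rank the general one:

[code]
import numpy as np

# Columns are the vectors being tested.
dependent = np.array([[1.0, 2.0],
                      [2.0, 4.0]])        # (2,4) = 2*(1,2)
independent = np.array([[1.0, 0.0],
                        [0.0, 1.0]])

print(np.linalg.det(dependent))           # 0.0 -> linearly dependent
print(np.linalg.det(independent))         # 1.0 -> linearly independent

# For a non-square collection, compare the rank with the number of vectors.
three_in_r2 = np.array([[1.0, 0.0, 1.0],
                        [0.0, 1.0, 1.0]]) # three vectors in R^2
print(np.linalg.matrix_rank(three_in_r2) == 3)   # False -> dependent
[/code]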

4. What is the significance of linear dependence and independence?

Linear dependence and independence are central concepts in linear algebra: they describe the relationships among vectors and determine the dimension of a vector space. In practical terms, checking whether the columns of a coefficient matrix are linearly independent tells us whether a system of linear equations has a unique solution.

5. Can a set of linearly dependent vectors span a vector space?

Yes, a linearly dependent set of vectors can span a vector space. The set is simply redundant: at least one of its vectors can be removed without shrinking the span, and any vector in the space can still be expressed as a linear combination of the remaining ones.
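
For example, the linearly dependent set {(1, 0), (0, 1), (1, 1)} spans R^2, since any vector (x, y) can be written as

[tex](x, y) = x(1, 0) + y(0, 1) + 0(1, 1)[/tex]

The dependence only means the set is redundant: (1, 1) = (1, 0) + (0, 1), so (1, 1) could be dropped without losing the span.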
