Linear Combinations of Dependent Vectors

Homework Help Overview

The discussion revolves around the concept of linear combinations of linearly dependent vectors in a vector space. The original poster explores the implications of linear dependence on the number of solutions for expressing a vector in terms of a linear combination of a given set of vectors.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to reason through the implications of linear dependence by considering finite versus infinite solutions for coefficients in a linear combination. They also explore the application of their reasoning across different vector spaces.
  • Another participant introduces the idea of manipulating the linear combination by adding a scalar multiple of the dependent relation to derive further insights.

Discussion Status

The discussion is active, with participants engaging in reasoning about the implications of linear dependence and exploring how different choices of scalars affect the linear combinations. There is a recognition of the relationship between the choices of coefficients and the resulting linear combinations.

Contextual Notes

Participants are navigating the definitions and properties of linear dependence and span within the context of vector spaces, questioning how these concepts apply universally across different dimensions and spaces.

Random Variable

Homework Statement
If (u,v,w) is a family of linearly dependent vectors in a vector space V and the vector x is in the span of (u,v,w), then there are infinitely many choices of α, β, and γ with x = αu + βv + γw.

Homework Equations
If (u,v,w) is linearly dependent, then there exist α, β, and γ, not all equal to zero, such that αu + βv + γw = 0.

The Attempt at a Solution
My first attempt, which didn't go anywhere, was to assume that there were only finitely many choices and see if that led to a contradiction.

For my second attempt, I started with the fact that (u,v,w) is linearly dependent. Then I multiplied both sides of the dependence relation by an arbitrary scalar n. Then I thought I could add x to both sides and manipulate the equation somehow, but that didn't lead anywhere either.

If I were dealing with the vector space R^n, then x = αu + βv + γw would have infinitely many solutions, because you would end up with at least one free variable, since at least one of the vectors is just a linear combination of the others. Can you apply that reasoning to all vector spaces?
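To make the free-variable intuition concrete, here is a small numerical sketch in R^3 (the vectors and numbers are made up for illustration, with w = u + v forcing the dependence):

```python
import numpy as np

# Hypothetical dependent family in R^3: w = u + v.
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = u + v

A = np.column_stack([u, v, w])   # columns are u, v, w
# Rank 2 with 3 columns means A @ [α, β, γ] = x has a free variable.
print(np.linalg.matrix_rank(A))  # → 2

x = 2 * u + 3 * v                # some x in span(u, v, w)
# Two different coefficient triples that both give x:
for coeffs in ([2.0, 3.0, 0.0], [1.0, 2.0, 1.0]):
    print(np.allclose(A @ np.array(coeffs), x))  # → True both times
```

The rank deficiency is exactly the "at least one free variable" in the argument above: the coefficient matrix has more columns than independent columns.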
 
As you say, since (u,v,w) is linearly dependent, there exist α, β, γ, not all 0, such that αu + βv + γw = 0.

Further, since x is in the span of (u,v,w), there exist scalars a, b, c such that au + bv + cw = x.

Now, what can you say about au + bv + cw + R(αu + βv + γw) for any real number R?
 
It equals x?

(Rα + a)u + (Rβ + b)v + (Rγ + c)w = x

And since at least one of α, β, γ is nonzero, different values of R give different coefficient triples, so the equation has infinitely many solutions?
 
Yes, every different choice for R gives a different linear combination, but they are all equal to x!

And it is doing exactly what YOU suggested:
For my second attempt, I started with the fact that (u,v,w) is linearly dependent. Then I multiplied both sides by an arbitrary scalar n. Then I thought I could add x to both sides and manipulate the equation
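The argument can be checked numerically; in this sketch the vectors, the dependence coefficients α, β, γ, and the particular representation a, b, c are all invented for illustration:

```python
import numpy as np

# Hypothetical setup: dependence relation 1·u + 1·v - 1·w = 0 (since w = u + v),
# plus one particular representation x = a·u + b·v + c·w.
u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 1.0])
w = u + v
alpha, beta, gamma = 1.0, 1.0, -1.0      # αu + βv + γw = 0
a, b, c = 3.0, -1.0, 2.0
x = a * u + b * v + c * w

A = np.column_stack([u, v, w])
# Every R yields a different coefficient triple, yet all of them represent x.
for R in [0.0, 1.0, -2.5, 10.0]:
    coeffs = np.array([R * alpha + a, R * beta + b, R * gamma + c])
    print(np.allclose(A @ coeffs, x))    # → True for each R
```

Each pass through the loop is one instance of the identity au + bv + cw + R(αu + βv + γw) = x + R·0 = x from the reply above.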
 
