Affine independence in terms of linear independence

SUMMARY

This discussion focuses on the relationship between affine independence and linear independence in vector spaces, specifically in the context of vectors in \(\mathbb{R}^n\) and \(\mathbb{R}^{n+1}\). The user seeks to understand how the linear independence of a family of \(n\) vectors from an arbitrary origin vector implies the linear independence of the entire family of \(n+1\) vectors. Key points include the clarification of notation and the suggestion to visualize the problem using concrete examples, such as plotting vectors in \(\mathbb{R}^2\) and \(\mathbb{R}^3\).

PREREQUISITES
  • Understanding of linear independence in vector spaces
  • Familiarity with affine independence concepts
  • Knowledge of vector notation and operations in \(\mathbb{R}^n\)
  • Basic skills in visualizing vectors in Euclidean spaces
NEXT STEPS
  • Explore the concept of affine independence in detail
  • Learn about linear transformations and their impact on vector independence
  • Study examples of vector families in \(\mathbb{R}^2\) and \(\mathbb{R}^3\)
  • Investigate the geometric interpretation of linear combinations of vectors
USEFUL FOR

Mathematicians, physics students, and educators looking to deepen their understanding of vector independence concepts and their applications in higher-dimensional spaces.

Wiseguy
This question is mostly about viewing affine independence entirely in terms of the linear independence of different families of vectors. I understand there are quite a few questions already online about the affine/linear independence relationship, but I haven't been able to find one that addresses my particular problem, nor am I able to make the connection on my own.

I want to try to understand how the linear independence of a family of ##n## difference vectors from an arbitrary 'origin' point, say ##(\overrightarrow{a_i a_0}, \ldots, \overrightarrow{a_i a_j}, \ldots, \overrightarrow{a_i a_n})## where ##a_i, a_j \in \mathbb{R}^{n}## and ##j \neq i## for an arbitrary 'origin' index ##i \in I##, implies the linear independence of the whole family of ##(n+1)## vectors ##(\hat{a_0}, \ldots, \hat{a_n})##, where ##\hat{a_j} = (1, a_j)##.

I am able to understand this from the perspective of using families of points, but I am unable to visualize how I would construct this only using families of vectors. I've tried looking at the vectors as position vectors, but I think that way of thinking would not necessarily be correct.
 
Hello and welcome to physicsforums.

I'm afraid your notation is quite unusual.

What does ##\overrightarrow{a_i a_0}## represent? Given that you've said ##a_i\in\mathbb{R}##, that would suggest ##\overrightarrow{a_i a_0}=a_0-a_i\in\mathbb{R}##, which is a scalar. You can think of that as a vector if you like, but ##\mathbb{R}## as a vector space has only one dimension, so you can't have more than one linearly independent vector in it.

What does the right-hand side of ##\hat{a_j} = (1, a_j)## represent?
 
andrewkirk said:
Hello and welcome to physicsforums.

I'm afraid your notation is quite unusual.

What does ##\overrightarrow{a_i a_0}## represent? Given that you've said ##a_i\in\mathbb{R}##, that would suggest ##\overrightarrow{a_i a_0}=a_0-a_i\in\mathbb{R}##, which is a scalar. You can think of that as a vector if you like, but ##\mathbb{R}## as a vector space has only one dimension, so you can't have more than one linearly independent vector in it.

What does the right-hand side of ##\hat{a_j} = (1, a_j)## represent?

Thank you for your welcome.

I apologize. I meant to write ##\mathbb{R}^{n}##, not ##\mathbb{R}##. Yes, the notation ##\overrightarrow{a_i a_0}## is just used to represent ##(a_0 - a_i) \in \mathbb{R}^{n}##. We can keep it in the latter form if that makes more sense.

And ##\hat{a_j}## is just that: a vector in ##\mathbb{R}^{n+1}## consisting of ##(1, a_j)##. I would like to know the intuition as to why the linear independence of these two families is equivalent.
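
For concreteness, here is a sketch of the algebra connecting the two formulations, taking ##i=0## as the origin index. Suppose
$$\sum_{j=0}^{n} \lambda_j \hat{a_j} = 0.$$
The first coordinate gives ##\sum_{j=0}^{n} \lambda_j = 0##, and the remaining ##n## coordinates give ##\sum_{j=0}^{n} \lambda_j a_j = 0##. Substituting ##\lambda_0 = -\sum_{j=1}^{n} \lambda_j## into the second equation yields
$$\sum_{j=1}^{n} \lambda_j (a_j - a_0) = 0,$$
so if the difference vectors ##(a_j - a_0)## are linearly independent, all the ##\lambda_j## must vanish, and hence the ##\hat{a_j}## are linearly independent. Running the substitution backwards gives the converse.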
 
Is it a proof, or a visualization, that you are missing? If it's a visualization, why not take a small concrete example?
The easiest case that still has vector structure is ##n=2##. Take for instance ##a_0=(1,1)##, ##a_1=(2,2)##, ##a_2=(1,2)##. Draw a picture of these in ##\mathbb{R}^2##, and then another of what you get after the move into ##\mathbb{R}^3##.
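
As a quick numerical sanity check of this example (a sketch using NumPy's rank computation; the variable names are my own):

```python
import numpy as np

# The three points suggested above.
a0 = np.array([1.0, 1.0])
a1 = np.array([2.0, 2.0])
a2 = np.array([1.0, 2.0])

# Difference vectors from the 'origin' point a0, as columns of a 2x2 matrix.
diffs = np.column_stack([a1 - a0, a2 - a0])

# Lifted vectors (1, a_j) in R^3, as columns of a 3x3 matrix.
lifted = np.column_stack([np.append(1.0, a) for a in (a0, a1, a2)])

rank_diffs = np.linalg.matrix_rank(diffs)
rank_lifted = np.linalg.matrix_rank(lifted)

print(rank_diffs)   # 2: the two difference vectors are linearly independent
print(rank_lifted)  # 3: the three lifted vectors are linearly independent
```

Full rank in both matrices illustrates the equivalence for this particular family; moving one point onto the line through the other two drops both ranks by one.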
 
