Understanding Vectors: Properties and Applications

  • Context: MHB
  • Thread starter: mathmari
  • Tags: Properties, Vectors

Discussion Overview

The discussion revolves around the properties and applications of vectors in the context of linear independence, basis representation, and orthogonality within the vector space \( V = \mathbb{R}^n \). Participants explore mathematical relationships and proofs related to these concepts.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant proposes that dotting the linear combination \( \sum_{i=1}^k\lambda_i b_i \) with one of the vectors \( b_j \) extracts the corresponding coefficient, i.e. \( \left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j = \lambda_j \).
  • Another participant confirms that the vectors \( b_1, \ldots, b_k \) are linearly independent, since \( \sum_{i=1}^k\lambda_i b_i = 0 \) implies \( \lambda_j = 0 \) for all \( j \).
  • There is uncertainty about how to show that the number of vectors \( k \) is at most \( n \); one participant suggests using the fact that an \( n \)-dimensional space contains at most \( n \) linearly independent vectors.
  • Participants discuss the implications of \( B \) being a basis of \( V \) and how every vector \( v \) can be expressed as a linear combination of the basis vectors, specifically questioning the form \( v = \sum_{i=1}^n(v\cdot b_i)b_i \).
  • There is a consensus that to show the orthogonality of the matrix \( a \), it is necessary to demonstrate that \( a^Ta = I \) using the properties of the basis vectors.

Areas of Agreement / Disagreement

Participants generally agree on the mathematical relationships and the properties of the vectors discussed. The proof of \( k \leq n \) is only sketched, while the reasoning behind the linear combination representation of \( v \) is worked out in full later in the thread.

Contextual Notes

Some assumptions regarding the definitions of linear independence and basis vectors are implicit in the discussion. The proof steps for certain claims, such as the orthogonality of matrix \( a \), are not fully resolved.

mathmari
Hey!

Let $1\leq n\in \mathbb{N}$, $V=\mathbb{R}^n$ and $\cdot$ the standard scalar product. Let $b_1, \ldots , b_k\in V$ such that $$b_i\cdot b_j=\delta_{ij}$$
  1. Let $\lambda_1, \ldots , \lambda_k\in \mathbb{R}$. Determine $\displaystyle{\left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j}$ for $1\leq j\leq k$.
  2. Show that $b_1, \ldots , b_k$ are linearly independent and that $k\leq n$.
  3. Let $k=n$. Show that $B=(b_1, \ldots , b_n)$ is a basis of $V$ and it holds that $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$ for all $v\in V$.
  4. Let $k=n$. Show that $a=(b_1\mid \ldots \mid b_n)\in O_n$.

I have done the following:

For 1:
We have that $$\left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j=\sum_{i=1}^k\lambda_i \left (b_i\cdot b_j\right )=\lambda_j$$ or not? :unsure:
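As a quick numerical sanity check of this identity, here is a minimal NumPy sketch (the orthonormal vectors are generated by a QR factorization, which is an ingredient of the sketch, not of the exercise):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3
# k orthonormal vectors in R^n: the columns of the (reduced) QR factor of a random matrix.
B, _ = np.linalg.qr(rng.standard_normal((n, k)))

lam = rng.standard_normal(k)
s = B @ lam  # s = sum_i lambda_i * b_i
for j in range(k):
    # dotting s with b_j recovers lambda_j, since b_i . b_j = delta_ij
    print(np.isclose(s @ B[:, j], lam[j]))  # True for every j
```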
For 2:
We have that $$\sum_{i=1}^k\lambda_i b_i=0 \ \overset{\cdot b_j}{\longrightarrow} \ \left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j=0\cdot b_j \ \overset{\text{ Question } 1.}{\longrightarrow} \ \lambda_j=0$$ for all $1\leq j\leq k$, and so $b_1, \ldots , b_k$ are linearly independent.
Is this correct?
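As a numerical corroboration of the independence claim, a minimal rank check (again with QR-generated orthonormal columns, which the exercise itself does not use):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3
B, _ = np.linalg.qr(rng.standard_normal((n, k)))  # columns b_1..b_k, orthonormal

# Linear independence of the columns is equivalent to the matrix having rank k.
print(np.linalg.matrix_rank(B) == k)  # True
```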

How can we show that $k\leq n$? :unsure:

For 3:
We have that the vectors of $B$ are linearly independent, according to question 2, and the number of vectors equals the dimension of $V$. This implies that $B$ is a basis of $V$, right?
Since $B$ is a basis of $V$, every element of $V$ can be written as a linear combination of the elements of $B$. But why is this linear combination $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$ ? Is this because of the definition of $b_i$, i.e. that $b_i\cdot b_i=1$ ? :unsure:

For 4:
To show that the matrix $a$ is orthogonal, we have to show that $a^Ta=I=aa^T$ using the definition of the vectors $b_i$, i.e. that $b_i\cdot b_i=1$ and $b_i\cdot b_j=0$ for $i\neq j$, right? :unsure:
 
mathmari said:
For 1:
We have that $$\left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j=\sum_{i=1}^k\lambda_i \left (b_i\cdot b_j\right )=\lambda_j$$ or not?

Hey mathmari!

Yep. :)

mathmari said:
For 2:
Is this correct?

How can we show that $k\leq n$?

Correct, yes.

Hmm... I don't see an easy way to show that $k\le n$. :unsure:
Then again, perhaps we can use the property that a set of linearly independent vectors in an $n$-dimensional vector space contains at most $n$ vectors? 🤔
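To make that property concrete, here is a small illustrative NumPy sketch using Gram–Schmidt (neither NumPy nor Gram–Schmidt is part of the exercise): in $\mathbb{R}^3$, any fourth candidate vector collapses into the span of three orthonormal vectors that have already been found, so no more than $n$ such vectors can exist.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
basis = []
for _ in range(n + 1):
    v = rng.standard_normal(n)
    for b in basis:              # subtract the projections onto the b_i found so far
        v = v - (v @ b) * b
    norm = np.linalg.norm(v)
    if norm < 1e-10:             # the residual vanishes: v lies in the span of basis
        print(f"candidate {len(basis) + 1} is in the span of the others")
        break
    basis.append(v / norm)       # normalize, so that b_i . b_j = delta_ij holds
```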

mathmari said:
For 3:
We have that the vectors of $B$ are linearly independent, according to question 2, and the number of vectors equals the dimension of $V$. This implies that $B$ is a basis of $V$, right?
Since $B$ is a basis of $V$, every element of $V$ can be written as a linear combination of the elements of $B$. But why is this linear combination $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$ ? Is this because of the definition of $b_i$, i.e. that $b_i\cdot b_i=1$ ?

Yep.
Since $B$ is a basis we can write $v=\sum \lambda_j b_j$, can't we?
Suppose we use that to calculate $v\cdot b_i$... 🤔

mathmari said:
For 4:
To show that the matrix $a$ is orthogonal, we have to show that $a^Ta=I=aa^T$ using the definition of the vectors $b_i$, i.e. that $b_i\cdot b_i=1$ and $b_i\cdot b_j=0$ for $i\neq j$, right?

Yep.
$a^T$ has each vector $b_i$ as a row, doesn't it?
And $a$ has each $b_i$ as a column.
So if we calculate $a^Ta$, we indeed multiply each row $b_i$ of $a^T$ with each column $b_j$ of $a$, so that $(a^Ta)_{ij}=b_i\cdot b_j=\delta_{ij}$. 🤔
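Here is a quick numerical check of exactly that computation (an illustrative NumPy sketch; the orthonormal columns are again produced by a QR factorization):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
a, _ = np.linalg.qr(rng.standard_normal((n, n)))  # columns of a are orthonormal b_1..b_n

# (a^T a)_{ij} = b_i . b_j = delta_ij, so a^T a should be the identity matrix.
print(np.allclose(a.T @ a, np.eye(n)))  # True
print(np.allclose(a @ a.T, np.eye(n)))  # True as well, since a is square
```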
 
As for 3:
Since $B$ is a basis of $V$, we can write $\displaystyle{v=\sum_{i=1}^n\lambda_ib_i}$.
Then, by question 1, we get $$v\cdot b_j=\left (\sum_{i=1}^n\lambda_ib_i\right )\cdot b_j=\lambda_j$$
That means that the linear combination can be written as $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$.
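A quick numerical check of this representation (again an illustrative NumPy sketch with QR-generated orthonormal columns):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # columns b_1..b_n: an orthonormal basis

v = rng.standard_normal(n)
coeffs = Q.T @ v              # j-th entry is v . b_j
reconstructed = Q @ coeffs    # sum_j (v . b_j) * b_j
print(np.allclose(v, reconstructed))  # True
```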

🤓
 
Yep. :cool:
 
Great! Thanks a lot! 🥳
 
