MHB Understanding Vectors: Properties and Applications

mathmari
Hey!

Let $1\leq n\in \mathbb{N}$, $V=\mathbb{R}^n$ and $\cdot$ the standard scalar product. Let $b_1, \ldots , b_k\in V$ such that $$b_i\cdot b_j=\delta_{ij}$$
  1. Let $\lambda_1, \ldots , \lambda_k\in \mathbb{R}$. Determine $\displaystyle{\left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j}$ for $1\leq j\leq k$.
  2. Show that $b_1, \ldots , b_k$ are linearly independent and that $k\leq n$.
  3. Let $k=n$. Show that $B=(b_1, \ldots , b_n)$ is a basis of $V$ and it holds that $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$ for all $v\in V$.
  4. Let $k=n$. Show that $a=(b_1\mid \ldots \mid b_n)\in O_n$.
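(Side note: here is a quick numerical sketch of the setup, assuming numpy is available; the QR factorization is just one convenient way to manufacture an orthonormal family $b_1,\ldots,b_k$.)

```python
import numpy as np

n, k = 5, 3
rng = np.random.default_rng(0)

# Orthonormalize k random vectors in R^n; the columns of B are b_1, ..., b_k.
B, _ = np.linalg.qr(rng.standard_normal((n, k)))

# Check the defining property b_i . b_j = delta_ij:
# the Gram matrix B^T B should be the k x k identity.
print(np.allclose(B.T @ B, np.eye(k)))  # True
```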

I have done the following:

For 1:
We have that $$\left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j=\sum_{i=1}^k\lambda_i \left (b_i\cdot b_j\right )=\lambda_j$$ or not? :unsure:

For 2:
We have that $$\sum_{i=1}^k\lambda_i b_i=0 \ \overset{\cdot b_j}{\longrightarrow} \ \left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j=0\cdot b_j \ \overset{\text{ Question } 1.}{\longrightarrow} \ \lambda_j=0$$ for all $1\leq j\leq k$, and so $b_1, \ldots , b_k$ are linearly independent.
Is this correct?

How can we show that $k\leq n$? :unsure:

For 3:
We have that the vectors of $B$ are linearly independent, according to question 2, and the number of vectors equals the dimension of $V$. This implies that $B$ is a basis of $V$, right?
Since $B$ is a basis of $V$, every element of $V$ can be written as a linear combination of the elements of $B$. But why is this linear combination $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$ ? Is this because of the definition of $b_i$, i.e. that $b_i\cdot b_i=1$ ? :unsure:

For 4:
To show that the matrix $a$ is orthogonal, we have to show that $a^Ta=I=aa^T$ using the definition of the vectors $b_i$, i.e. that $b_i\cdot b_i=1$ and $b_i\cdot b_j=0$ for $i\neq j$, right? :unsure:
 
mathmari said:
For 1:
We have that $$\left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j=\sum_{i=1}^k\lambda_i \left (b_i\cdot b_j\right )=\lambda_j$$ or not?

Hey mathmari!

Yep. :)

mathmari said:
For 2:
Is this correct?

How can we show that $k\leq n$?

Yes, correct.

Hmm... I don't see an easy way to show that $k\le n$. :unsure:
Then again, perhaps we can use the property that an $n$-dimensional vector space contains at most $n$ linearly independent vectors? 🤔
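A small numerical illustration of that hint (a sketch assuming numpy; QR is again only used to produce an orthonormal family):

```python
import numpy as np

n, k = 5, 3
rng = np.random.default_rng(1)
B, _ = np.linalg.qr(rng.standard_normal((n, k)))  # columns are b_1, ..., b_k

# B^T B = I_k forces the k columns to be linearly independent,
# and a matrix with n rows can have rank at most n, hence k <= n.
print(np.linalg.matrix_rank(B))  # 3, i.e. k, and necessarily <= n = 5
```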

mathmari said:
For 3:
We have that the vectors of $B$ are linearly independent, according to question 2, and the number of vectors equals the dimension of $V$. This implies that $B$ is a basis of $V$, right?
Since $B$ is a basis of $V$, every element of $V$ can be written as a linear combination of the elements of $B$. But why is this linear combination $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$ ? Is this because of the definition of $b_i$, i.e. that $b_i\cdot b_i=1$ ?

Yep.
Since $B$ is a basis we can write $v=\sum \lambda_j b_j$, can't we?
Suppose we use that to calculate $v\cdot b_i$... 🤔

mathmari said:
For 4:
To show that the matrix $a$ is orthogonal, we have to show that $a^Ta=I=aa^T$ using the definition of the vectors $b_i$, i.e. that $b_i\cdot b_i=1$ and $b_i\cdot b_j=0$ for $i\neq j$, right?

Yep.
$a^T$ has each vector $b_i$ as a row, doesn't it?
And $a$ has each $b_i$ as a column.
So if we calculate $a^Ta$, we indeed multiply each row $b_i$ of $a^T$ with each column $b_j$ of $a$. 🤔
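A numerical sanity check of exactly that computation (a sketch assuming numpy; the particular orthonormal columns come from a QR factorization):

```python
import numpy as np

n = 4
rng = np.random.default_rng(2)
a, _ = np.linalg.qr(rng.standard_normal((n, n)))  # square matrix with orthonormal columns

# Entry (i, j) of a^T a is row b_i of a^T times column b_j of a,
# i.e. b_i . b_j = delta_ij; for a square matrix, a a^T = I follows as well.
print(np.allclose(a.T @ a, np.eye(n)))  # True
print(np.allclose(a @ a.T, np.eye(n)))  # True
```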
 
As for 3:
Since $B$ is a basis of $V$, we can write $\displaystyle{v=\sum_{i=1}^n\lambda_ib_i}$.
Then we get $$v\cdot b_j=\left (\sum_{i=1}^n\lambda_ib_i\right )\cdot b_j=\lambda_j$$
That means that the linear combination can be written as $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$.
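(For what it's worth, a quick numerical check of this reconstruction formula, again a sketch assuming numpy:)

```python
import numpy as np

n = 4
rng = np.random.default_rng(3)
B, _ = np.linalg.qr(rng.standard_normal((n, n)))  # columns b_1, ..., b_n: an orthonormal basis
v = rng.standard_normal(n)

# lambda_i = v . b_i, collected all at once as B^T v; then v = sum_i lambda_i b_i = B (B^T v).
coeffs = B.T @ v
print(np.allclose(B @ coeffs, v))  # True
```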

🤓
 
Yep. :cool:
 
Great! Thanks a lot! 🥳
 
