POTW A Modified Basis in an Inner Product Space

In a complex inner product space V of dimension n with an orthonormal basis {e_1, ..., e_n}, it is shown that if vectors v_1, ..., v_n satisfy a sum-of-squared-norms condition, namely that the squared norms of v_1, ..., v_n sum to less than 1, then the set {v_1 + e_1, ..., v_n + e_n} forms a basis for V. The argument combines orthonormality with the Cauchy-Schwarz inequality: the norm condition forces linear independence, and a dimension count then gives spanning.
Euge
Given an orthonormal basis ##\{e_1,\ldots, e_n\}## in a complex inner product space ##V## of dimension ##n##, show that if ##v_1,\ldots, v_n\in V## such that ##\sum_{j = 1}^n \|v_j\|^2 < 1##, then ##\{v_1 + e_1,\ldots, v_n + e_n\}## is a basis for ##V##.
 
We prove linear independence. First, since ##\|v_i\|^2 \leq \sum_{j=1}^n \|v_j\|^2 < 1 = \|e_i\|^2##, we have ##\| v_i + e_i \| \geq \| e_i \| - \| v_i \| > 0##, so none of the vectors ##v_1 + e_1 , \dots , v_n + e_n## is the zero vector. We wish to prove that if

$$
\sum_{i=1}^n \alpha_i (v_i + e_i) = 0
$$

then ##\alpha_i = 0## for ##i = 1 , \dots , n##. Suppose, for contradiction, that not all of the ##\alpha_i## are zero. Rearranging the relation above to ##\sum_{i=1}^n \alpha_i e_i = -\sum_{i=1}^n \alpha_i v_i## and taking norms gives

$$
\| \sum_{i=1}^n \alpha_i e_i \|^2 = \| \sum_{i=1}^n \alpha_i v_i \|^2 .
$$

By the triangle inequality and then the Cauchy-Schwarz inequality in ##\mathbb{C}^n##,

\begin{align*}
\| \sum_{i=1}^n \alpha_i v_i \|^2 & \leq (\sum_{i=1}^n | \alpha_i | \cdot \| v_i \| )^2
\nonumber \\
& \leq ( \sum_{i=1}^n |\alpha_i|^2) ( \sum_{i=1}^n \| v_i \|^2 )
\nonumber \\
& < \sum_{i=1}^n | \alpha_i |^2 .
\end{align*}

Note the strict inequality in the final step: it uses the hypothesis ##\sum_{i=1}^n \| v_i \|^2 < 1## together with ##\sum_{i=1}^n | \alpha_i |^2 > 0##, which holds because not all the ##\alpha_i## are zero. However, by orthonormality,

$$
\| \sum_{i=1}^n \alpha_i e_i \|^2 = \sum_{i=1}^n | \alpha_i |^2
$$

which contradicts the strict inequality above. The only way out of the contradiction is for all the ##\alpha_i## to be zero, so the vectors are linearly independent.
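As a quick numerical sanity check of the inequality chain (a sketch in Python with NumPy, not part of the proof): for random complex ##v_1, \dots, v_n## rescaled so that ##\sum_j \|v_j\|^2 = 0.9 < 1##, and a random ##\alpha##, the quantity ##\| \sum_i \alpha_i v_i \|^2## should come out strictly below ##\sum_i |\alpha_i|^2##.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random complex vectors v_1..v_n (the rows of V),
# rescaled so that sum_j ||v_j||^2 = 0.9 < 1.
V = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
V *= np.sqrt(0.9 / np.sum(np.abs(V) ** 2))

# Random complex coefficients alpha (nonzero with probability 1).
alpha = rng.normal(size=n) + 1j * rng.normal(size=n)

lhs = np.linalg.norm(alpha @ V) ** 2   # || sum_i alpha_i v_i ||^2
rhs = np.sum(np.abs(alpha) ** 2)       # sum_i |alpha_i|^2
assert lhs < rhs                       # the strict inequality from the proof
```

The chain in the proof in fact guarantees the sharper bound ##\mathrm{lhs} \leq 0.9 \cdot \mathrm{rhs}## here, since the squared norms of the ##v_j## sum to exactly ##0.9##.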


A standard result is that any set of ##n## linearly independent vectors in an ##n##-dimensional vector space ##V## forms a basis for that space. Proof: Consider

$$
c_0 v+\sum_{i=1}^n c_i (v_i + e_i)
$$

with arbitrary ##c_i## ##(i= 0, 1, \dots , n)## and ##v \in V##. The equation

$$
c_0 v+\sum_{i=1}^n c_i (v_i + e_i) = 0 \qquad (*)
$$

cannot have as its only solution

$$
c_i = 0 \qquad i= 0, 1, \dots , n
$$

since that would mean that there are ##n+1## independent vectors in ##V##,

$$
\{ v , v_1 + e_1 , \dots , v_n + e_n \}
$$

and the dimension of ##V## would not be ##n## but at least ##n+1##. Therefore a set ##\{ c_i \}## exists, not all zero, such that ##(*)## is satisfied. One cannot have ##c_0 = 0##: by the linear independence just proved, that would force all the remaining ##c_i## to be zero as well, contradicting nontriviality. Therefore, we can write

$$
v = - \sum_{i=1}^n \frac{c_i}{c_0} (v_i +e_i) .
$$

Thus an arbitrary vector ##v## has been expressed as a linear combination of the vectors ##\{ v_1 +e_1 , \dots , v_n + e_n \}##. Being linearly independent as well, they span the space and hence form a basis.
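The two halves of the argument can be checked together numerically. Below is a minimal sketch in Python with NumPy, assuming the standard basis of ##\mathbb{C}^n## as the ##e_j##: for random ##v_j## with ##\sum_j \|v_j\|^2 < 1##, the matrix whose rows are ##v_j + e_j## should always have full rank ##n##, i.e. its rows form a basis.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

for trial in range(100):
    # Random complex v_1..v_n (the rows of V),
    # rescaled so that sum_j ||v_j||^2 is uniform in [0, 1).
    V = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    V *= np.sqrt(rng.uniform(0.0, 1.0) / np.sum(np.abs(V) ** 2))

    M = V + np.eye(n)  # rows are v_j + e_j, with e_j the standard basis
    assert np.linalg.matrix_rank(M) == n  # full rank: the rows form a basis
```

The check is numerically robust: the smallest singular value of ##I + V## is at least ##1 - \|V\|##, and the Frobenius norm of ##V## is ##\sqrt{\sum_j \|v_j\|^2} < 1##, so the rank never degenerates.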
 
