A Modified Basis in an Inner Product Space (Undergrad)

SUMMARY

The discussion focuses on proving that the set ##\{v_1 + e_1,\ldots, v_n + e_n\}## forms a basis for a complex inner product space ##V## of dimension ##n##, given an orthonormal basis ##\{e_1,\ldots, e_n\}## and the condition ##\sum_{j = 1}^n \|v_j\|^2 < 1##. The proof combines the triangle and Cauchy-Schwarz inequalities with the Pythagorean identity for the orthonormal basis. The conclusion is that the perturbed vectors are linearly independent and span the space.

PREREQUISITES
  • Understanding of inner product spaces
  • Knowledge of orthonormal bases
  • Familiarity with vector norms and their properties
  • Basic linear algebra concepts
NEXT STEPS
  • Study the properties of inner product spaces in depth
  • Learn about the Gram-Schmidt process for constructing orthonormal bases
  • Explore the implications of linear independence in vector spaces
  • Investigate applications of modified bases in functional analysis
USEFUL FOR

Mathematicians, students of linear algebra, and researchers in functional analysis who are interested in the properties of inner product spaces and basis transformations.

Euge
Given an orthonormal basis ##\{e_1,\ldots, e_n\}## in a complex inner product space ##V## of dimension ##n##, show that if ##v_1,\ldots, v_n\in V## such that ##\sum_{j = 1}^n \|v_j\|^2 < 1##, then ##\{v_1 + e_1,\ldots, v_n + e_n\}## is a basis for ##V##.
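For a concrete instance of the hypothesis, here is a minimal numerical sketch (assuming NumPy and identifying ##V## with ##\mathbb{C}^n##, so that the ##e_j## are the standard basis vectors; the names `V`, `E`, `B` are just illustrative): random ##v_j## are rescaled so that ##\sum_{j=1}^n \|v_j\|^2 < 1##, and the perturbed vectors ##v_j + e_j## are collected as the columns of a matrix.

```python
import numpy as np

n = 5
rng = np.random.default_rng(0)

# Column j of V plays the role of v_j; E has the orthonormal basis e_j as columns.
V = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
V *= 0.9 / np.linalg.norm(V, 'fro')   # Frobenius norm^2 = sum_j ||v_j||^2, rescaled to 0.81 < 1
E = np.eye(n)

print(sum(np.linalg.norm(V[:, j])**2 for j in range(n)))  # ~0.81, so the hypothesis holds
B = V + E                                                  # columns are v_j + e_j
```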
 
We prove linear independence. First, since ##\|v_i\|^2 \leq \sum_{j=1}^n \|v_j\|^2 < 1 = \|e_i\|^2##, we have ##\| v_i + e_i \| \geq \| e_i \| - \| v_i \| > 0##, so none of the vectors ##v_1 + e_1 , \dots , v_n + e_n## is the zero vector. We wish to prove that if

$$
\sum_{i=1}^n \alpha_i (v_i + e_i) = 0
$$

then ##\alpha_i = 0## for ##i = 1 , \dots , n##. Suppose, for contradiction, that the ##\alpha_i## are not all zero. The displayed condition gives ##\sum_{i=1}^n \alpha_i e_i = -\sum_{i=1}^n \alpha_i v_i##, and taking norms yields

$$
\| \sum_{i=1}^n \alpha_i e_i \|^2 = \| \sum_{i=1}^n \alpha_i v_i \|^2 .
$$

By the triangle inequality and then the Cauchy-Schwarz inequality, we have

\begin{align*}
\| \sum_{i=1}^n \alpha_i v_i \|^2 & \leq (\sum_{i=1}^n | \alpha_i | \cdot \| v_i \| )^2
\nonumber \\
& \leq ( \sum_{i=1}^n |\alpha_i|^2) ( \sum_{i=1}^n \| v_i \|^2 )
\nonumber \\
& < \sum_{i=1}^n | \alpha_i |^2 .
\end{align*}

Note that the final inequality is strict: it uses ##\sum_{i=1}^n \| v_i \|^2 < 1## together with ##\sum_{i=1}^n | \alpha_i |^2 > 0##, which holds because the ##\alpha_i## are not all zero. However, by orthonormality of the ##e_i##,

$$
\| \sum_{i=1}^n \alpha_i e_i \|^2 = \sum_{i=1}^n | \alpha_i |^2
$$

and so we have a contradiction. Hence all the ##\alpha_i## must be zero, and the vectors ##v_1 + e_1 , \dots , v_n + e_n## are linearly independent.
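The independence argument can be sanity-checked numerically. Below is a hedged sketch (assuming NumPy and the same ##\mathbb{C}^n## model as above, with randomly generated data): for random coefficients ##\alpha_i## that are not all zero, it confirms the strict inequality ##\|\sum_i \alpha_i v_i\|^2 < \sum_i |\alpha_i|^2 = \|\sum_i \alpha_i e_i\|^2##, and it checks that the matrix whose columns are the ##v_j + e_j## has full rank ##n##.

```python
import numpy as np

n = 5
rng = np.random.default_rng(1)

# Same setup as before: columns of V are the v_j, rescaled so sum_j ||v_j||^2 < 1.
V = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
V *= 0.9 / np.linalg.norm(V, 'fro')

# Random coefficients alpha_i (nonzero with probability 1).
alpha = rng.standard_normal(n) + 1j * rng.standard_normal(n)

lhs = np.linalg.norm(V @ alpha) ** 2     # || sum_i alpha_i v_i ||^2
rhs = np.sum(np.abs(alpha) ** 2)         # sum_i |alpha_i|^2 = || sum_i alpha_i e_i ||^2
assert lhs < rhs                         # the strict inequality used in the proof

B = V + np.eye(n)                        # columns are v_j + e_j
assert np.linalg.matrix_rank(B) == n     # they are linearly independent
```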


A standard result is that any set of ##n## linearly independent vectors in an ##n##-dimensional vector space ##V## forms a basis for that space. Proof: Consider

$$
c_0 v+\sum_{i=1}^n c_i (v_i + e_i)
$$

with scalars ##c_i## ##(i = 0, 1, \dots , n)## and an arbitrary ##v \in V##. The equation

$$
c_0 v+\sum_{i=1}^n c_i (v_i + e_i) = 0 \qquad (*)
$$

cannot imply

$$
c_i = 0 \qquad i= 0, 1, \dots , n
$$

since that would mean that there are ##n+1## independent vectors in ##V##,

$$
\{ v , v_1 + e_1 , \dots , v_n + e_n \}
$$

and the dimension of ##V## would not be ##n##, but would be at least ##n+1##. Therefore, scalars ##\{ c_i \}##, not all zero, exist such that ##(*)## is satisfied. One cannot have ##c_0 = 0##, since the linear independence proved above would then force all the ##c_i## to be zero, contradicting the choice of a non-trivial solution. Therefore, we can write

$$
v = - \sum_{i=1}^n \frac{c_i}{c_0} (v_i +e_i) .
$$

Therefore, an arbitrary vector ##v## has been expressed as a linear combination of the vectors ##v_1 + e_1 , \dots , v_n + e_n##. This proves that, in addition to being linearly independent, they span the space; hence they form a basis.
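The spanning argument can be mirrored concretely: in the ##\mathbb{C}^n## model, the coefficients of an arbitrary ##v## in the new basis are obtained by solving one linear system. A minimal sketch (again assuming NumPy; `np.linalg.solve` simply plays the role of the abstract existence argument above):

```python
import numpy as np

n = 5
rng = np.random.default_rng(2)

# Hypothesis: sum_j ||v_j||^2 < 1, with the v_j as the columns of V.
V = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
V *= 0.9 / np.linalg.norm(V, 'fro')
B = V + np.eye(n)                        # columns are v_j + e_j

v = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # an arbitrary vector of C^n

c = np.linalg.solve(B, v)                # coordinates of v in the basis {v_j + e_j}
assert np.allclose(B @ c, v)             # v = sum_j c_j (v_j + e_j)
```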
 
