Gram-Schmidt Orthogonalization Process

  • Thread starter: Mr Davis 97
  • Tags: Process
Mr Davis 97

Homework Statement


Find an orthogonal basis for ##\operatorname{span} (S)## if ##S= \{1,x,x^2 \}##, and ##\langle f,g \rangle = \int_0^1 f(x) g(x) \, dx##

Homework Equations

The Attempt at a Solution


So we start with the usual procedure.

Let ##v_1 = 1##. Then ##\displaystyle v_2 = x - \frac{\langle x,1 \rangle}{\| 1 \|^2}(1) = x - \frac{1}{2}##.
Then ##\displaystyle v_3 = x^2 - \frac{\langle x^2,1 \rangle}{\| 1 \|^2}(1) - \frac{\langle x^2,x \rangle}{\| x \|^2}(x) = x^2 - \frac{1}{3} - \frac{3}{4}x##.

But this is not correct, because if I calculate ##\displaystyle \langle 1, x^2 - \frac{1}{3} - \frac{3}{4}x\rangle = \int_0^1 \left( x^2 - \frac{1}{3} - \frac{3}{4}x \right) dx = -\frac{3}{8} \ne 0##.
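A quick symbolic check of that integral (a sketch in sympy; the variable names are illustrative):

```python
# Check <1, x^2 - 1/3 - (3/4)x> with <f, g> = integral of f*g over [0, 1].
# sympy sketch; variable names are illustrative.
import sympy as sp

x = sp.symbols('x')
v3_attempt = x**2 - sp.Rational(1, 3) - sp.Rational(3, 4) * x

# This would be 0 if v3_attempt were orthogonal to 1; it is not.
print(sp.integrate(1 * v3_attempt, (x, 0, 1)))   # -3/8
```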

What am I doing wrong?
 
Shouldn't the ##v_3## definition use ##v_2 = x - \frac{1}{2}## in the third term instead of ##x##?
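A quick sympy sketch of that correction, projecting ##x^2## onto ##v_2## instead of ##x## (the inner-product helper and names are just for illustration):

```python
# Gram-Schmidt on {1, x, x^2} with <f, g> = integral of f*g over [0, 1],
# projecting x^2 onto v2 = x - 1/2 rather than onto x.
# sympy sketch; helper names are illustrative.
import sympy as sp

x = sp.symbols('x')

def inner(f, g):
    return sp.integrate(f * g, (x, 0, 1))

v1 = sp.Integer(1)
v2 = x - inner(x, v1) / inner(v1, v1) * v1            # x - 1/2
v3 = (x**2
      - inner(x**2, v1) / inner(v1, v1) * v1
      - inner(x**2, v2) / inner(v2, v2) * v2)          # project onto v2, not x

print(sp.expand(v3))                   # x**2 - x + 1/6
print(inner(v1, v3), inner(v2, v3))    # 0 0
```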
 
Mr Davis 97 said:
What am I doing wrong?
Wrong normalization.
$$v_2=\frac{u_2}{\| u_2 \|}, \qquad u_2 = x - \langle x, v_1 \rangle v_1$$
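For reference, a sympy sketch of the orthonormal version that formula describes: subtract ##\langle f, v_i \rangle v_i## for the already-normalized ##v_i##, then normalize (names are illustrative):

```python
# Orthonormal Gram-Schmidt on {1, x, x^2}: u = f - sum of <f, v_i> v_i over
# the already-normalized v_i, then v = u / ||u||.
# sympy sketch; names are illustrative.
import sympy as sp

x = sp.symbols('x')

def inner(f, g):
    return sp.integrate(f * g, (x, 0, 1))

basis = []                                   # orthonormal v_1, v_2, v_3
for f in (sp.Integer(1), x, x**2):
    u = f - sum(inner(f, v) * v for v in basis)
    basis.append(sp.simplify(u / sp.sqrt(inner(u, u))))

print(basis)
# up to how sympy arranges factors:
# [1, sqrt(3)*(2*x - 1), sqrt(5)*(6*x**2 - 6*x + 1)]
```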
 