A problem on finding orthogonal basis and projection

visharad
Use the inner product ##\langle f,g\rangle = \int_0^1 f(x)\,g(x)\,dx## for continuous functions on the interval ##[0, 1]##.

a) Find an orthogonal basis for ##\operatorname{span}\{x, x^2, x^3\}##.

b) Project the function ##y = 3(x+x^2)## onto this basis.
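
For example, with this inner product,
$$\langle x, x^2\rangle = \int_0^1 x\cdot x^2\,dx = \int_0^1 x^3\,dx = \frac14 \neq 0,$$
so ##x## and ##x^2## are not orthogonal to each other, which is why part (a) asks for a different basis of the same span.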
---------------------------------------------------------
I know the following:
Two vectors are orthogonal if their inner product is 0.
A set of vectors is orthogonal if ##\langle v_i, v_j\rangle = 0## for every pair of distinct members ##v_i \ne v_j## of the set.
If ##S = \{v_1, v_2, \ldots, v_n\}## is a basis for an inner product space and ##S## is also an orthogonal set, then ##S## is an orthogonal basis.

Regarding projection, I know that if ##W## is a finite-dimensional subspace of an inner product space ##V##, ##W## has an orthogonal basis ##S = \{v_1, v_2, \ldots, v_n\}##, and ##u## is any vector in ##V##, then
$$\operatorname{proj}_W u = \frac{\langle u, v_1\rangle}{\|v_1\|^2}\,v_1 + \frac{\langle u, v_2\rangle}{\|v_2\|^2}\,v_2 + \cdots + \frac{\langle u, v_n\rangle}{\|v_n\|^2}\,v_n.$$
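
For example, with the inner product above, the projection of ##x^2## onto the single vector ##x## would be
$$\frac{\langle x^2, x\rangle}{\|x\|^2}\,x = \frac{\int_0^1 x^3\,dx}{\int_0^1 x^2\,dx}\,x = \frac{1/4}{1/3}\,x = \frac{3}{4}\,x.$$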

I can calculate integrals, but I really do not know how to fit all these together for this problem. I am not sure how to start.
 
Either you guess a basis, which is possible with a few trial-and-error attempts, or you formally apply the Gram-Schmidt orthogonalization algorithm. If you have the new basis ##\{\,f(x),g(x),h(x)\,\}##, then write ##3x+3x^2 = \alpha_1f(x)+\alpha_2g(x)+\alpha_3h(x)\,.##
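
Not an official solution, just a sketch you could use to check your hand computation: assuming SymPy is available, the snippet below applies Gram-Schmidt with this particular inner product and then projects ##3x+3x^2## onto the resulting basis (the helper names inner and gram_schmidt are my own, not standard library functions).

Python:
import sympy as sp

x = sp.symbols('x')

def inner(f, g):
    # <f, g> = integral of f(x) g(x) from 0 to 1
    return sp.integrate(f * g, (x, 0, 1))

def gram_schmidt(funcs):
    # Orthogonalize the given functions with respect to inner()
    basis = []
    for f in funcs:
        v = f - sum(inner(f, b) / inner(b, b) * b for b in basis)
        basis.append(sp.simplify(v))
    return basis

basis = gram_schmidt([x, x**2, x**3])
print(basis)            # orthogonal basis for span{x, x^2, x^3}

u = 3*x + 3*x**2
proj = sum(inner(u, b) / inner(b, b) * b for b in basis)
print(sp.expand(proj))  # equals 3x + 3x^2, since u already lies in the span

Since ##3x+3x^2## is itself in ##\operatorname{span}\{x, x^2, x^3\}##, its projection onto that subspace is just itself; the interesting part of (b) is finding the coefficients ##\alpha_i## with respect to the orthogonal basis.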
 