Dual vector space - Lagrange Interpolating Polynomial

Summary: The discussion centers on the Lagrange interpolating polynomial and its properties in the context of dual vector spaces. It establishes that the evaluation functionals form a basis of the dual space of polynomials of degree at most n, proving their linear independence; derives the unique polynomial satisfying the interpolation conditions; and shows that the integral of a polynomial over [0,1] can be written as a weighted sum of its values at the interpolation points, computing the weights explicitly in the quadratic case. The exercise comes from a linear algebra text linked at the end of the thread.
SqueeSpleen
I think I solved it a week ago, but I didn't write everything down and I want to be sure I did it right; besides, the exercise of writing it up here in LaTeX helps me a lot (I started about 3 threads and didn't submit them because writing them out clarified things enough for me to find the answer myself xD). So feel free to correct me.

Let \alpha _{0},...,\alpha _{n} \in K with \alpha _{i} \neq \alpha _{j} if i\neq j.

For every i, 0 \leq i \leq n, we define \varepsilon _{\alpha _i}:K_{n}[X]\rightarrow K by \varepsilon _{\alpha _i}(P)=P(\alpha _i).

i) Prove that B_{1} = \left \{ \varepsilon _{\alpha _{0}},...,\varepsilon _{\alpha _{n}} \right \} is a basis of (K_{n}[X])^{*}

ii) Let B = \left \{ P_{0},...,P_{n} \right \} be the basis of K_{n}[X] such that B^{*}=B_{1}. Prove that the polynomial

P = \sum _{i=0}^{n} \beta _i P_i

is the only polynomial in K[X] of degree at most n such that

\forall i, 0 \leq i \leq n, P(\alpha _{i}) = \beta _{i}

This polynomial is called the Lagrange interpolating polynomial (irrelevant :P).

iii) Prove that there exist real numbers \left \{ a_{0},...,a_{n} \right \} such that, for all P \in \mathbb{R}_{n}[X],
\int _{0}^{1}P(x)dx=\sum _{i=0}^{n}a_{i} P(\alpha _{i})
Find a_{0}, a_{1} and a_{2} when n = 2, \alpha _{0}=1, \alpha _{1}=\frac{1}{2} and \alpha _{2}=0.

Relevant proposition:
Let V be a K-vector space of dimension n, and let V^{*} be its dual space.
Let B_{1}=\left \{ \varphi _{1},...,\varphi _{n} \right \} be a basis of V^{*}. Then there is exactly one basis B = \left \{ v _{1},...,v _{n} \right \} of V such that:
B^{*}=B_{1}

i) Suppose that:
\sum _{i=0}^{n}c_{i}\varepsilon _{\alpha _{i}}=0
for some c_{0},...,c_{n} \in K (using c_{i} for the coefficients, since \alpha _{i} already names the points).
Then, for all P \in K_n[X], we have: \sum _{i=0}^{n}c_{i}\varepsilon _{\alpha _{i}}(P)=\sum _{i=0}^{n}c_{i}P(\alpha _{i})=0
Fix j, 0 \leq j \leq n, and take the polynomial:
Q_{j}=\prod _{i=0,i\neq j}^{n}(X-\alpha _{i})
Evaluating the sum at Q_{j} we get c_{j}=0, because Q_{j}(\alpha _{i})=0 for every i≠j, while Q_{j}(\alpha _{j}) can't be zero because \alpha _{j} \neq \alpha _{i} if i≠j.
Then we have the linear independence of the set, and as it has n+1 elements, its span has dimension n+1 and is a subspace of (K_{n}[X])^{*}, so it's a basis of (K_{n}[X])^{*} (because the whole space also has dimension n+1).
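Written out in full, that evaluation step reads:
0=\sum _{i=0}^{n}c_{i}\varepsilon _{\alpha _{i}}(Q_{j})=\sum _{i=0}^{n}c_{i}Q_{j}(\alpha _{i})=c_{j}Q_{j}(\alpha _{j})=c_{j}\prod _{i\neq j}(\alpha _{j}-\alpha _{i})
and the product is nonzero because the \alpha _{i} are pairwise distinct, so c_{j}=0.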
ii) We know B_{1} is a basis of (K_n[X])^{*}, so by the proposition there is exactly one basis
B= \left \{ P_{0},...,P_{n} \right \} such that B^{*}=B_{1}.
So, by the definition of dual basis,
\varepsilon _{\alpha _{i}}(P_{j})=P_{j}(\alpha _{i})=\left\{\begin{matrix}1 & \text{if } i=j\\ 0 & \text{if } i\neq j\end{matrix}\right.
Then P(\alpha _{j})=\sum _{i=0}^{n} \beta _i P_i(\alpha _{j})=\beta _{j}.
And P is the only such polynomial: if Q \in K_{n}[X] also satisfies Q(\alpha _{i})=\beta _{i} for all i, then its coordinates in the basis B are \varepsilon _{\alpha _{i}}(Q)=Q(\alpha _{i})=\beta _{i}, the same as those of P, so Q=P.
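Though the exercise doesn't require it, the P_{j} can also be written down explicitly; this is the classical formula for the Lagrange basis, and it visibly satisfies P_{j}(\alpha _{i})=1 if i=j and 0 otherwise:
P_{j}=\prod _{i=0,i\neq j}^{n}\frac{X-\alpha _{i}}{\alpha _{j}-\alpha _{i}}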

iii)
I'm stuck on iii) for now, but I'll think about it again after sleeping; I'm probably failing just because I'm so sleepy.




 
I couldn't find the edit button on my first post (I assume it was locked), so I'm bumping the thread now that I've finished instead of burying it at the bottom of the forum's abyss.
If P=\sum _{i=0}^{n}\beta _{i} P_{i}, then
\int _{0}^{1}P(x)dx=\int _{0}^{1}\sum _{i=0}^{n} \beta _{i} P_{i} (x)dx = \sum _{i=0}^{n} \beta _{i}\int _{0}^{1}P_{i} (x)dx
On the other hand,
\sum _{i=0}^{n}a_{i} P(\alpha _{i})=\sum _{i=0}^{n}a_{i}\sum _{j=0}^{n} \beta _{j} P_{j} (\alpha _{i})=\sum _{i=0}^{n}a_{i}\beta _{i}
So let a_{i}=\int _{0}^{1}P_{i} (x)dx for all 0 \leq i \leq n. Then
\int _{0}^{1}P(x)dx=\sum _{i=0}^{n}a_{i}\beta _{i}=\sum _{i=0}^{n}a_{i} P(\alpha _{i})
Now let's find a_{0}, a_{1} and a_{2} when n = 2, \alpha _{0}=1, \alpha _{1}=\frac{1}{2} and \alpha _{2}=0.
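As a quick sanity check, take P \equiv 1: then every \beta _{i}=P(\alpha _{i})=1, so the weights must satisfy
\sum _{i=0}^{n}a_{i}=\int _{0}^{1}1\,dx=1
which we can use below to verify the computed values.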
\left\{\begin{matrix}
P_{0}(\alpha _{0})=1\\
P_{0}(\alpha _{1})=0\\
P_{0}(\alpha _{2})=0
\end{matrix}\right.
\left\{\begin{matrix}
P_{0}(1)=1\\
P_{0}(\frac{1}{2})=0\\
P_{0}(0)=0
\end{matrix}\right.
As we know two roots of P_{0}, we only need to multiply by a constant to make it equal to 1 when evaluated at 1: P_{0}(1)=c(1-\frac{1}{2})\cdot 1=\frac{c}{2}=1 gives c=2, so
P_{0}=2(X-\frac{1}{2})X

\left\{\begin{matrix}
P_{1}(1)=0\\
P_{1}(\frac{1}{2})=1\\
P_{1}(0)=0
\end{matrix}\right.
P_{1}=-4(X-1)X

\left\{\begin{matrix}
P_{2}(1)=0\\
P_{2}(\frac{1}{2})=0\\
P_{2}(0)=1
\end{matrix}\right.

P_{2}=2(X-\frac{1}{2})(X-1)
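Each constant comes from the same computation: P_{j} vanishes at the other two nodes, so P_{j}=c_{j}\prod _{i\neq j}(X-\alpha _{i}), and the condition P_{j}(\alpha _{j})=1 forces
c_{j}=\frac{1}{\prod _{i\neq j}(\alpha _{j}-\alpha _{i})}
For instance, c_{1}=\frac{1}{(\frac{1}{2}-1)(\frac{1}{2}-0)}=-4, matching P_{1} above.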

So we only need to evaluate the integrals of these polynomials.
Then:
a_{0}=\int _{0}^{1}(2x^{2}-x)dx=\frac{2}{3}-\frac{1}{2}=\frac{1}{6}
a_{1}=\int _{0}^{1}(-4x^{2}+4x)dx=-\frac{4}{3}+2=\frac{2}{3}
a_{2}=\int _{0}^{1}(2x^{2}-3x+1)dx=\frac{2}{3}-\frac{3}{2}+1=\frac{1}{6}
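Note that a_{0}+a_{1}+a_{2}=\frac{1}{6}+\frac{2}{3}+\frac{1}{6}=1, as the P\equiv 1 check requires; these are exactly the weights of Simpson's rule on [0,1]. A minimal numerical check (my own sketch, not part of the exercise, assuming the weights and nodes above):

# Verify: integral of P over [0,1] equals a0*P(1) + a1*P(1/2) + a2*P(0)
a = [1/6, 2/3, 1/6]        # weights a0, a1, a2 found above
alpha = [1.0, 0.5, 0.0]    # nodes alpha0, alpha1, alpha2
for c2, c1, c0 in [(1, 0, 0), (3, -2, 5), (0, 1, 1)]:
    P = lambda x: c2*x**2 + c1*x + c0      # an arbitrary quadratic
    exact = c2/3 + c1/2 + c0               # its integral over [0, 1]
    approx = sum(ai * P(t) for ai, t in zip(a, alpha))
    assert abs(exact - approx) < 1e-12     # the identity holds exactly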

PS: The exercise is from http://mate.dm.uba.ar/~jeronimo/algebra_lineal/AlgebraLineal.pdf (page 115, Exercise 16). This isn't the book used for the dual vector space part of my linear algebra course, but I found its exercises pretty interesting to do.
 