# Approximation theory problem: show nonexistence of best approximation

1. Feb 4, 2012

### Reverberant

1. The problem statement, all variables and given/known data
Problem 1.8 here (Link to Google books)
Clarification: C[0,1] denotes the continuous functions on the interval [0,1], and let S denote the set of points in the problem as stated (I can't tell whether it's an S or a P in the book).

2. Relevant equations
Have I understood the problem correctly if I say that one way to solve it would be to choose the function f such that, regardless of which a in A I pick, I can always find another a' in A such that max|f(x) - a'(x)| is smaller than max|f(x) - a(x)| (where the max is taken over all x in S)? How do I go about choosing such a function f? What should I be thinking about? This is where I'm stuck, so I'm afraid I can't post an attempt at a solution yet.
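Your proof strategy sounds right: to show no best approximation exists, it suffices to show the infimum of max|f(x) - a(x)| over a in A is not attained, i.e. every a in A can be strictly beaten by some a' in A. Here is a minimal numerical sketch of that strategy on a *hypothetical* toy instance (not the book's actual A and S): take S = {0, 0.25, 0.5, 0.75, 1}, f(x) = x, and A = {c·x : 0 < c < 1}, where the error is 1 - c and the improved candidate c' = (1 + c)/2 always beats c.

```python
# Hypothetical toy instance illustrating the "always strictly improvable" strategy:
# S = {0, 0.25, 0.5, 0.75, 1}, f(x) = x, A = {c*x : 0 < c < 1}.
# Then max_{x in S} |f(x) - c*x| = (1 - c) * max(S) = 1 - c, so the infimum
# over A is 0, attained only at c = 1, which lies outside A.

S = [0.0, 0.25, 0.5, 0.75, 1.0]
f = lambda x: x

def sup_error(c):
    """Max over the finite set S of |f(x) - c*x|."""
    return max(abs(f(x) - c * x) for x in S)

c = 0.5
for _ in range(5):
    c_prime = (1 + c) / 2            # candidate a' strictly closer to f than a
    assert sup_error(c_prime) < sup_error(c)
    c = c_prime
```

The point of the sketch is only the shape of the argument: given any candidate, exhibit an explicit better one, so no minimizer can exist. For the book's problem you would replace this toy A with the set in Problem 1.8 and construct the improving a' from the structure of that set.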