I just started Analysis 1 this week and I've encountered some tricky problems in the assignment. Perhaps I am misunderstanding the way to solve this one.

Homework Statement

Let f, g : [0,1] -> R be bounded functions. Prove that

inf{ f(x) + g(1-x) : x ∈ [0,1] } >= inf{ f(x) : x ∈ [0,1] } + inf{ g(x) : x ∈ [0,1] }.

The Attempt at a Solution

In my mind I would
a) show that inf{ f(x) + g(1-x) : x ∈ [0,1] } >= inf{ f(x) : x ∈ [0,1] } + inf{ g(1-x) : x ∈ [0,1] }, and
b) show that inf{ g(1-x) : x ∈ [0,1] } = inf{ g(x) : x ∈ [0,1] } (the two infima agree even though g(1-x) and g(x) need not agree pointwise, because the substitution t = 1-x maps [0,1] onto itself, so both expressions range over the same set of values; see the sketch below), and hence that inf{ f(x) + g(1-x) } >= inf{ f(x) } + inf{ g(x) }.
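Concretely, I think the reindexing in (b) goes like this (t is just the name I'm giving the substituted variable):

\[
\{\, g(1-x) : x \in [0,1] \,\} \;=\; \{\, g(t) : t \in [0,1] \,\}
\qquad \text{(substitute } t = 1-x \text{, which maps } [0,1] \text{ onto itself)},
\]
\[
\text{so} \quad \inf\{\, g(1-x) : x \in [0,1] \,\} \;=\; \inf\{\, g(t) : t \in [0,1] \,\}.
\]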
I don't know how to show (a), and I'm having a bit of trouble with it. I know that I can use the triangle inequality to say that |f(x) + g(1-x)| <= |f(x)| + |g(1-x)|, but I am unsure how to proceed from there. I have been thinking of perhaps using the Completeness Property to say that |f(x)| + |g(1-x)| >= -[f(x) + g(1-x)], but...
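After writing this out, I think (a) may follow directly from the definition of the infimum as the greatest lower bound, with no triangle inequality needed; the Completeness Property enters only to guarantee the infima exist, since f and g are bounded. A sketch (m_f and m_g are shorthand I'm introducing here):

\[
m_f := \inf\{\, f(x) : x \in [0,1] \,\}, \qquad m_g := \inf\{\, g(1-x) : x \in [0,1] \,\}
\]
% both infima exist by the Completeness Property, because f and g are bounded
\[
\text{For every } x \in [0,1]: \quad f(x) \ge m_f \ \text{ and } \ g(1-x) \ge m_g,
\quad\text{so}\quad f(x) + g(1-x) \;\ge\; m_f + m_g .
\]
\[
\text{Thus } m_f + m_g \text{ is a lower bound for } \{\, f(x) + g(1-x) : x \in [0,1] \,\},
\text{ and the infimum is the greatest lower bound, so}
\]
\[
\inf\{\, f(x) + g(1-x) : x \in [0,1] \,\} \;\ge\; m_f + m_g .
\]

Combined with (b), m_g equals inf{ g(x) : x ∈ [0,1] }, which is exactly the inequality in the problem statement. Does that look right?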