Let G be the set of all real functions f(x), each of which is analytic on the interval [0,1] and satisfies the conditions f(0) + a*f'(0) = 0 and f(1) + b*f'(1) = 0, where (a,b) is a pair of real numbers from the set D = {(a,b) in R^2 : 1 + b - a != 0}. Prove that the set G is a linear vector space. How can one find a basis for this vector space?
Well, what have you done on this? Do you know the definition of a "linear vector space"? If so, you just need to show that G satisfies all of the requirements in that definition. Assuming that addition and scalar multiplication are just ordinary addition of functions and multiplication by numbers, you don't need to show things like commutativity of addition or the distributive law; those hold automatically for functions.
I totally understand what a linear vector space is. I was able to prove that a scalar multiple of a function f(x) in G is again an element of G; in other words, c*f(x) is in G for all real c. That was almost trivial. I got stuck while trying to prove that the sum of two functions f(x) and g(x) in G is also an element of G; I have been trying to make a breakthrough for like two weeks. I also have no idea how to go about finding a basis for the vector space (I wonder whether the space is infinite-dimensional).
I don't think there's a problem. Define the sum by (f+g)(x) = f(x) + g(x) = h(x), where f, g are from G; we need to prove that h is also in G. 1. h is obviously a function with the same domain as f and g. 2. Assuming f and g satisfy the conditions with the same pair (a,b): (f+g)(0) + a*(f+g)'(0) = (f(0) + a*f'(0)) + (g(0) + a*g'(0)) = 0 + 0 = 0, and the condition at x = 1 works the same way with b. 3. The only thing I am not sure about is whether h(x) = f(x) + g(x) is analytic when f(x) and g(x) are, but I think there is a theorem about this or smth (there is one for differentiable functions). You may want to check it out. My reply may be a bit unclear but I am late for a lecture :) Hope it helps.
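For completeness, here are both endpoint computations written out, under the assumption that f and g satisfy the conditions with the same fixed pair (a,b):

```latex
(f+g)(0) + a\,(f+g)'(0) = \bigl(f(0) + a f'(0)\bigr) + \bigl(g(0) + a g'(0)\bigr) = 0 + 0 = 0,
```
```latex
(f+g)(1) + b\,(f+g)'(1) = \bigl(f(1) + b f'(1)\bigr) + \bigl(g(1) + b g'(1)\bigr) = 0 + 0 = 0.
```

Note that both computations use linearity of the derivative and the shared pair (a,b); whether that assumption is justified is exactly the sticking point.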
@batboio Yeah, there is a theorem about that: the sum and the product of two functions analytic on an interval are also analytic there. If a and b were fixed, it would be quite easy to prove that f(x) + g(x) = h(x) is also in G; but the problem lies in the fact that the pair (a,b) may be different for different functions. For example, for f(x) = e^x we get a = -1, b = -1; for f(x) = sin x, a = 0, b = -tan 1; and for h(x) = e^x + sin x, a = -1/2, b = -(e + sin 1)/(e + cos 1). See, the values of a and b are not constant. What I have been trying to do is show that for the sum f(x) + g(x) = h(x) we can find values of a and b such that 1 + b - a != 0, h(0) + a*h'(0) = 0, and h(1) + b*h'(1) = 0, which would mean that h(x) is also an element of G. But all to no avail! I was wondering whether you have any idea about the dimension of such a vector space (is it infinite-dimensional?) and how to find a basis for G. Someone suggested that I could use the method of undetermined coefficients (I read up on it, but couldn't see how to apply it here).
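Those values can be checked numerically. A minimal sketch (the derivatives of e^x and sin x are hard-coded, since both are known in closed form; `boundary_pair` is just a throwaway helper name):

```python
import math

# For a function with f'(0) != 0 and f'(1) != 0, the boundary conditions
# f(0) + a*f'(0) = 0 and f(1) + b*f'(1) = 0 force
#   a = -f(0)/f'(0)   and   b = -f(1)/f'(1).
def boundary_pair(f, df):
    a = -f(0) / df(0)
    b = -f(1) / df(1)
    return a, b

# f(x) = e^x, f'(x) = e^x:  a = -1, b = -1
print(boundary_pair(math.exp, math.exp))

# g(x) = sin x, g'(x) = cos x:  a = 0, b = -tan(1)
print(boundary_pair(math.sin, math.cos))

# h(x) = e^x + sin x, h'(x) = e^x + cos x:
#   a = -1/2, b = -(e + sin 1)/(e + cos 1), and 1 + b - a != 0
h  = lambda x: math.exp(x) + math.sin(x)
dh = lambda x: math.exp(x) + math.cos(x)
a, b = boundary_pair(h, dh)
print(a, b, 1 + b - a)
```

This confirms that the forced pair (a,b) really does change from function to function, which is the whole difficulty with closure under addition.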
Okay, so you see how to prove that G is a vector space. You are correct that this is NOT a finite dimensional space. I suspect a basis would not even be countable. Are you actually asked to find a basis or was that your question?
Are you sure a and b are not fixed? Because this seems like a really weak condition on the functions. Assuming f'(0) and f'(1) are nonzero, the boundary conditions force a = -f(0)/f'(0) and b = -f(1)/f'(1), so if you use [tex]1+b{\neq}a[/tex] you get a condition directly on the function: [tex]\frac{f(1)}{f^{\prime}(1)}-\frac{f(0)}{f^{\prime}(0)}{\neq}1[/tex] Anyway, in that case I can't come up with anything; you may try to define the addition differently. As for the dimension, I also don't think it will be finite. You can start here: http://mathworld.wolfram.com/GeneralizedFourierSeries.html For an arbitrary function, if you have a complete orthogonal system, you can write the function as a series in that system (the system is basically a basis for the function space). In your case, if you pick a complete orthogonal system, all you need to show is that its functions satisfy the condition with a and b (i.e. that they belong to your vector space).
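As a quick sanity check on that last suggestion, here is a sketch testing the family sin(n*pi*x), n = 1, 2, ... (my own pick of a complete orthogonal system on [0,1], not something fixed by the thread). Each member vanishes at both endpoints, so the forced parameters are a = b = 0 and 1 + b - a = 1 != 0, up to floating-point error:

```python
import math

# Membership test for G: with f'(0) != 0 and f'(1) != 0, the boundary
# conditions force a = -f(0)/f'(0) and b = -f(1)/f'(1); f belongs to G
# iff 1 + b - a != 0, i.e. f(1)/f'(1) - f(0)/f'(0) != 1.
def in_G(f, df, tol=1e-9):
    a = -f(0) / df(0)
    b = -f(1) / df(1)
    return abs(1 + b - a) > tol

# Candidate family sin(n*pi*x): f(0) = f(1) = 0,
# f'(x) = n*pi*cos(n*pi*x), so a and b are both (numerically) zero.
for n in range(1, 6):
    f  = lambda x, n=n: math.sin(n * math.pi * x)
    df = lambda x, n=n: n * math.pi * math.cos(n * math.pi * x)
    assert in_G(f, df)   # 1 + b - a is approximately 1
```

This only shows that the candidate system lies inside G; whether it is complete for G (and whether G is even closed under addition with varying (a,b)) is the open question in this thread.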