Prove that G is a linear vector space

SUMMARY

The discussion centers on proving that the set G, consisting of all real functions f(x) that are analytic on the interval [0,1] and satisfy specific boundary conditions, forms a linear vector space. The boundary conditions are f(0) + a*f'(0) = 0 and f(1) + b*f'(1) = 0, where (a,b) belongs to the set D = {(a,b) in R^2: 1 + b - a ≠ 0}. Participants confirm that scalar multiplication keeps a function in G, while the difficulty lies in showing that the sum of two functions in G also belongs to G, since the pair (a,b) may differ from function to function. The dimension of the space is argued to be infinite, and the discussion suggests using a complete orthogonal system to find a basis.

PREREQUISITES
  • Understanding of linear vector space definitions and properties
  • Knowledge of analytic functions and their properties
  • Familiarity with boundary value problems
  • Concept of complete orthogonal systems in functional analysis
NEXT STEPS
  • Research the properties of analytic functions and their sums
  • Study the method of unknown coefficients in functional spaces
  • Explore the concept of complete orthogonal systems and their applications
  • Investigate generalized Fourier series and their relation to vector spaces
USEFUL FOR

Mathematicians, students of functional analysis, and anyone interested in the properties of linear vector spaces and analytic functions.

Let G be the set of all real functions f(x) each of which is analytic in the interval [0,1] and satisfies the conditions: f(0)+a*f'(0)=0; f(1)+b*f'(1)=0, where (a,b) is a pair of real numbers from the set D={(a,b) in R^2: 1+b-a!=0 (is not equal to zero)}. Prove that the set G is a linear vector space. How to find a basis in the vector space?
 
Well, what have you done on this? Do you know the definition of "linear vector space"? If so, you just need to show that G satisfies all of the requirements in that definition. Assuming that addition and scalar multiplication are ordinary addition of functions and multiplication by numbers, you don't need to show things like commutativity of addition or the distributive law; those hold automatically.
 
I totally understand what a linear vector space is. I was able to prove that scalar multiplication of a function f(x) in G stays in G; in other words, c*f(x) is in G for all real c. That was almost trivial. I got stuck while trying to prove that the sum of two functions f(x) and g(x) in G is also an element of G. I have been trying to make a breakthrough for about two weeks. I also have no idea how to go about finding a basis of the vector space (I wonder whether the space is infinite-dimensional).
 
I don't think there's a problem. Defining the sum as (f+g)(x) = f(x) + g(x) = h(x), where f and g are from G, we need to prove that h is also in G.
1. h is obviously a function with the same domain as f and g.
2. For a fixed pair (a,b): (f+g)(0) + a*(f+g)'(0) = [f(0) + a*f'(0)] + [g(0) + a*g'(0)] = 0 + 0 = 0, and likewise (f+g)(1) + b*(f+g)'(1) = 0 (this follows directly from the definition of the sum).
3. The only thing I am not sure about is whether h(x) = f(x) + g(x) is analytic when f(x) and g(x) are, but I think there is a theorem about this (there is one for differentiable functions). You may want to check it out.

My reply may be a bit unclear but I am late for a lecture :) Hope it helps.
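The closure argument for a fixed pair (a, b) is easy to sanity-check numerically. Here is a minimal Python sketch; the particular functions e^x and 1 + x + x^2 are my own choice (both satisfy the boundary conditions with a = b = -1, a pair that lies in D since 1 + b - a = 1):

```python
import math

a, b = -1.0, -1.0  # a fixed pair from D: 1 + b - a = 1 != 0

# Two functions satisfying f(0) + a*f'(0) = 0 and f(1) + b*f'(1) = 0
# for this fixed (a, b): f(x) = e^x and g(x) = 1 + x + x^2.
f  = lambda x: math.exp(x)
df = lambda x: math.exp(x)
g  = lambda x: 1 + x + x**2
dg = lambda x: 1 + 2*x

def in_G(h, dh, a, b, tol=1e-12):
    """Check both boundary conditions for the fixed pair (a, b)."""
    return abs(h(0) + a*dh(0)) < tol and abs(h(1) + b*dh(1)) < tol

assert in_G(f, df, a, b)
assert in_G(g, dg, a, b)

# Any linear combination stays in G for the same fixed (a, b),
# because both boundary conditions are linear in the function:
c1, c2 = 3.0, -2.0
h  = lambda x: c1*f(x) + c2*g(x)
dh = lambda x: c1*df(x) + c2*dg(x)
assert in_G(h, dh, a, b)
print("closure holds for fixed (a, b) =", (a, b))
```

The point of the sketch is only that both conditions are linear in f, so for fixed (a, b) any linear combination automatically satisfies them.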
 
@batboio Yeah, there is a theorem about that: the sum and the product of two functions analytic on an interval are also analytic there. If "a" and "b" are fixed, then it's quite easy to prove that f(x)+g(x)=h(x) is also in G; the problem lies in the fact that the pair (a,b) may be different for different functions. For example, for f(x)=e^x, a=-1, b=-1; for f(x)=sin x, a=0, b=-tan 1; and for h(x)=e^x + sin x, a=-1/2, b=-(e+sin 1)/(e+cos 1). So the values of "a" and "b" are not constant. What I have been trying to do is show that for the sum f(x)+g(x)=h(x) we can find values of "a" and "b" such that 1+b-a!=0, h(0)+a*h'(0)=0, and h(1)+b*h'(1)=0, which would mean that h(x) is also an element of G. But all to no avail!
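One way to see this concretely: whenever f'(0) and f'(1) are nonzero, the boundary conditions pin down a = -f(0)/f'(0) and b = -f(1)/f'(1), so the pair can be computed mechanically. A short Python sketch (the helper name pair_for is my own):

```python
import math

def pair_for(h, dh):
    """The conditions h(0) + a*h'(0) = 0 and h(1) + b*h'(1) = 0
    determine (a, b) uniquely whenever h'(0) and h'(1) are nonzero."""
    return -h(0) / dh(0), -h(1) / dh(1)

a1, b1 = pair_for(math.exp, math.exp)                      # f(x) = e^x
a2, b2 = pair_for(math.sin, math.cos)                      # f(x) = sin x
a3, b3 = pair_for(lambda x: math.exp(x) + math.sin(x),
                  lambda x: math.exp(x) + math.cos(x))     # h = e^x + sin x

print(a1, b1)   # a = -1, b = -1
print(a2, b2)   # a = 0,  b = -tan(1), roughly -1.557
print(a3, b3)   # a = -1/2, b = -(e + sin 1)/(e + cos 1)

# Each individual pair still lies in D (1 + b - a != 0), but the
# pair itself changes with the function, which is exactly the
# obstacle to closure under addition when (a, b) may vary.
for a, b in [(a1, b1), (a2, b2), (a3, b3)]:
    assert abs(1 + b - a) > 1e-9
```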

I was wondering if you have any idea about the dimension of such a vector space (is it infinite-dimensional?) and how to find a basis of G. Someone suggested the method of unknown coefficients (I read up on it, but couldn't see how to apply it in this case).
 
Okay, so you see how to prove that G is a vector space.

You are correct that this is NOT a finite dimensional space. I suspect a basis would not even be countable. Are you actually asked to find a basis or was that your question?
 
Are you sure about a and b not being fixed? Because this seems like a really weak condition on the functions. If you combine 1 + b - a != 0 with a = -f(0)/f'(0) and b = -f(1)/f'(1) (from the boundary conditions), you get a condition on the functions:
f(1)/f'(1) - f(0)/f'(0) != 1.
Anyway, in that case I can't come up with anything. You may try to define the addition differently.
As for the dimension, I also don't think it will be finite. You can start here: http://mathworld.wolfram.com/GeneralizedFourierSeries.html
For an arbitrary function, if you have a complete orthogonal system, you can write the function as a series in that system (the system is basically a basis of the functional vector space). In your case, if you pick a complete orthogonal system, all you need to show is that its functions satisfy the condition with a and b (i.e. they belong to your vector space).
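As a concrete special case (my own illustration, not from the thread): for (a, b) = (0, 0), which lies in D, the conditions reduce to f(0) = f(1) = 0, and the sine system sin(n*pi*x) is a complete orthogonal system on [0, 1] whose members all satisfy them. A Python sketch checking this numerically with a composite Simpson rule:

```python
import math

def simpson(fn, a, b, n=2000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = fn(a) + fn(b)
    s += 4 * sum(fn(a + i*h) for i in range(1, n, 2))
    s += 2 * sum(fn(a + i*h) for i in range(2, n, 2))
    return s * h / 3

phi = lambda n: (lambda x: math.sin(n * math.pi * x))

# Each basis function satisfies the (a, b) = (0, 0) conditions
# f(0) = 0 and f(1) = 0:
for n in range(1, 5):
    assert abs(phi(n)(0)) < 1e-12 and abs(phi(n)(1)) < 1e-12

# Orthogonality on [0, 1]: <phi_m, phi_n> = 0 for m != n, 1/2 for m = n.
for m in range(1, 4):
    for n in range(1, 4):
        ip = simpson(lambda x: phi(m)(x) * phi(n)(x), 0.0, 1.0)
        expected = 0.5 if m == n else 0.0
        assert abs(ip - expected) < 1e-9

print("sin(n*pi*x): orthogonal, and every member satisfies the (0,0) conditions")
```

For general (a, b) in D one would have to exhibit a complete orthogonal system adapted to those boundary conditions; the sine system above only handles the (0, 0) case.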
 