Prove that G is a linear vector space

In summary, the conversation discusses the set G of real functions that are analytic on the interval [0,1] and satisfy certain boundary conditions. The set G is shown to be a linear vector space, but finding a basis for this space proves challenging. The dimension of the space is suspected to be infinite, and finding a basis may involve a complete orthogonal system.
  • #1
rillchritty
Let G be the set of all real functions f(x), each of which is analytic on the interval [0,1] and satisfies the conditions f(0) + a*f'(0) = 0 and f(1) + b*f'(1) = 0, where (a,b) is a pair of real numbers from the set D = {(a,b) in R^2 : 1 + b - a ≠ 0}. Prove that the set G is a linear vector space. How can one find a basis for this vector space?
 
  • #2
Well, what have you done on this? Do you know the definition of "linear vector space"? If so, you just need to show that G satisfies all of the requirements in that definition. Assuming that addition and scalar multiplication are just ordinary addition of functions and multiplication by numbers, you don't need to show things like commutativity of addition or the distributive law: those hold automatically for real-valued functions.
 
  • #3
I totally understand what a linear vector space is. I was able to prove that scalar multiplication of a function f(x) in G is an element of G; in other words c*f(x) is in G for all real c. That was almost trivial. I got stuck while trying to prove that the sum of two functions f(x) and g(x) in G is also an element of G. I have been trying to make a breakthrough for like two weeks. I have no idea how to go about finding the basis of the vector space (I wonder if the space has an infinite dimension).
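For concreteness, the scalar-multiplication step can be sanity-checked numerically. This is only an illustrative sketch: sin(πx) is chosen here as an example function (it satisfies the conditions with a = b = 0, and 1 + b - a = 1 ≠ 0).

```python
import math

# Illustrative check: f(x) = sin(pi*x) satisfies the boundary conditions
# with (a, b) = (0, 0), since f(0) = f(1) = 0.
f  = lambda x: math.sin(math.pi * x)
df = lambda x: math.pi * math.cos(math.pi * x)
a, b = 0.0, 0.0

for c in (2.0, -3.5, 0.25):
    # (c*f)(0) + a*(c*f)'(0) = c*(f(0) + a*f'(0)) = c*0 = 0,
    # so c*f satisfies the conditions with the SAME pair (a, b).
    assert abs(c * f(0) + a * c * df(0)) < 1e-9
    assert abs(c * f(1) + b * c * df(1)) < 1e-9

print("c*f stays in G with the same pair (a, b)")
```

The key point is that the boundary condition is linear in f, so scaling f scales the left-hand side by c and zero stays zero.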
 
  • #4
I don't think there's a problem. Defining the sum by (f+g)(x) = f(x) + g(x) = h(x), where f and g are from G, we need to prove that h is also in G.
1. h is obviously a function with the same domain as f and g.
2. For a fixed pair (a,b): (f+g)(0) + a*(f+g)'(0) = (f(0) + a*f'(0)) + (g(0) + a*g'(0)) = 0 + 0 = 0, directly from the definition of the sum (and likewise at x = 1 with b).
3. The only thing I am not sure about is whether h(x) = f(x) + g(x) is analytic when f(x) and g(x) are, but I think there is a theorem about this (there is one for differentiable functions). You may want to check it out.
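Point 2 can be sketched numerically under the assumption that f and g share the same pair (a, b); sin(πx) and sin(2πx) are example choices here, both satisfying the conditions with a = b = 0:

```python
import math

a, b = 0.0, 0.0  # a fixed admissible pair: 1 + b - a = 1 != 0

# Two functions satisfying the conditions for THIS pair (a, b):
# sin(pi*x) and sin(2*pi*x) both vanish at x = 0 and x = 1.
f,  df = (lambda x: math.sin(math.pi * x),
          lambda x: math.pi * math.cos(math.pi * x))
g,  dg = (lambda x: math.sin(2 * math.pi * x),
          lambda x: 2 * math.pi * math.cos(2 * math.pi * x))
h,  dh = (lambda x: f(x) + g(x),
          lambda x: df(x) + dg(x))

# The boundary functional L(u) = u(0) + a*u'(0) is linear in u,
# so it vanishes on h = f + g whenever it vanishes on f and g.
for u, du in ((f, df), (g, dg), (h, dh)):
    assert abs(u(0) + a * du(0)) < 1e-9
    assert abs(u(1) + b * du(1)) < 1e-9

print("h = f + g satisfies the same boundary conditions")
```

Note this sketch only covers the case where f and g come with the same (a, b); whether that assumption is justified is exactly what the next posts debate.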

My reply may be a bit unclear but I am late for a lecture :) Hope it helps.
 
  • #5
@batboio Yeah, there is a theorem about that: the sum and the product of two functions analytic on an interval are also analytic on that interval. If "a" and "b" are fixed, then it's quite easy to prove that f(x)+g(x)=h(x) is also in G; but the problem lies in the fact that the pair (a,b) may be different for different functions. For example, for f(x)=e^x, a=-1, b=-1; for f(x)=sin x, a=0, b=-tan 1; but for h(x)=e^x + sin x, a=-1/2 and b=-(e+sin 1)/(e+cos 1). So the values of "a" and "b" are not constant. What I have been trying to do is show that for the sum f(x)+g(x)=h(x) we can find values of "a" and "b" such that 1+b-a ≠ 0, h(0)+a*h'(0)=0 and h(1)+b*h'(1)=0, which would mean that h(x) is also an element of G. But all to no avail!
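Those example values can be double-checked numerically. This is a sketch assuming f'(0) ≠ 0 and f'(1) ≠ 0, so that the boundary conditions determine a and b uniquely:

```python
import math

def ab_pair(f, df):
    # If f'(0) != 0 and f'(1) != 0, the conditions f(0) + a*f'(0) = 0
    # and f(1) + b*f'(1) = 0 force a = -f(0)/f'(0) and b = -f(1)/f'(1).
    return -f(0) / df(0), -f(1) / df(1)

print(ab_pair(math.exp, math.exp))   # e^x       -> a = -1,   b = -1
print(ab_pair(math.sin, math.cos))   # sin x     -> a =  0,   b = -tan 1
h  = lambda x: math.exp(x) + math.sin(x)
dh = lambda x: math.exp(x) + math.cos(x)
print(ab_pair(h, dh))                # e^x+sin x -> a = -1/2, b = -(e+sin 1)/(e+cos 1)
```

The three pairs come out different, which is the obstruction to the naive closure argument.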

I was wondering if you have any idea about the dimension of such a vector space (is the dimension infinite?) and how to find the basis in G. Someone suggested that I could use the method of unknown coefficients (I read that up, but couldn't get how to use it in this case).
 
  • #6
Okay, so you see how to prove that G is a vector space.

You are correct that this is NOT a finite dimensional space. I suspect a basis would not even be countable. Are you actually asked to find a basis or was that your question?
 
  • #7
Are you sure about a and b not being fixed? Because this seems like a really weak condition on the functions. If you use [tex]1+b{\neq}a[/tex] (and assume f'(0) and f'(1) are nonzero) you can get a condition on the functions:
[tex]\frac{f(1)}{f^{\prime}(1)}-\frac{f(0)}{f^{\prime}(0)}{\neq}1[/tex]
In that case I can't come up with anything; you may try to define the addition differently.
As for the dimension I also don't think it will be finite. You can start here: http://mathworld.wolfram.com/GeneralizedFourierSeries.html
For an arbitrary function, if you have a complete orthogonal system you can write the function as a series in that system (the system is basically a basis of the function space). In your case, if you pick a complete orthogonal system, all you need to show is that its functions satisfy the condition with a and b (i.e. that they belong to your vector space).
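That last check can be sketched in code, using the familiar orthogonal system sin(nπx) on [0,1] as an example choice of system:

```python
import math

def satisfies_conditions(f, df, a, b, tol=1e-9):
    # f(0) + a*f'(0) = 0, f(1) + b*f'(1) = 0, and (a, b) in D, i.e. 1 + b - a != 0
    return (abs(f(0) + a * df(0)) < tol
            and abs(f(1) + b * df(1)) < tol
            and abs(1 + b - a) > tol)

# Every sin(n*pi*x) vanishes at both endpoints, so the single pair
# (a, b) = (0, 0) works for the whole system, and 1 + 0 - 0 = 1 != 0
# keeps it inside D.
for n in range(1, 6):
    f  = lambda x, n=n: math.sin(n * math.pi * x)
    df = lambda x, n=n: n * math.pi * math.cos(n * math.pi * x)
    assert satisfies_conditions(f, df, 0.0, 0.0)

print("sin(n*pi*x), n = 1..5, all lie in G with (a, b) = (0, 0)")
```

Whether this particular system is complete for G as defined here is a separate question; the sketch only verifies the membership condition the post describes.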
 

1. What are the properties of a linear vector space?

A linear vector space, also known as a vector space, is a set of elements that can be added together and multiplied by scalars (usually numbers) to produce other elements in the space. The properties of a linear vector space include closure under addition and scalar multiplication, associativity of addition, commutativity of addition, existence of an additive identity element (zero vector), existence of additive inverses, associativity of scalar multiplication, distributivity of scalar multiplication over vector addition, and distributivity of scalar multiplication over scalar addition.

2. How do you prove that a set G is a linear vector space?

To prove that a set G is a linear vector space, you must verify that it satisfies every property listed above. Specific examples can build intuition, but the proof itself must be carried out for arbitrary elements of the set: show closure under addition and scalar multiplication, exhibit the zero vector and additive inverses, and check the remaining axioms in general, not just for a few chosen elements.
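As a toy illustration of the closure checks, here is a sketch using a hypothetical example space: polynomials with zero constant term (so p(0) = 0), represented by coefficient lists.

```python
# Toy space S = {polynomials p with p(0) = 0}, represented by
# coefficient lists [c1, c2, ...] for c1*x + c2*x^2 + ...
# (no constant term, so p(0) = 0 holds automatically for every element).

def add(p, q):
    # Closure under addition: pad to equal length, add coefficient-wise.
    n = max(len(p), len(q))
    p = p + [0.0] * (n - len(p))
    q = q + [0.0] * (n - len(q))
    return [pi + qi for pi, qi in zip(p, q)]

def scale(c, p):
    # Closure under scalar multiplication.
    return [c * pi for pi in p]

p = [1.0, 2.0]        # x + 2x^2
q = [3.0]             # 3x
print(add(p, q))      # -> [4.0, 2.0], i.e. 4x + 2x^2, still in S
print(scale(-2.0, p)) # -> [-2.0, -4.0]
zero = []             # the zero polynomial is the additive identity
print(add(p, zero))   # -> [1.0, 2.0]
```

The representation makes the closure properties visible: any sum or scalar multiple of coefficient lists is again a coefficient list with no constant term.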

3. Can a linear vector space have an infinite number of elements?

Yes, a linear vector space can have an infinite number of elements. In fact, many commonly used vector spaces, such as the space of real numbers or the space of polynomials, have infinitely many elements. The key is that the set must satisfy all of the properties of a linear vector space, regardless of the number of elements it contains.

4. What is the difference between a linear vector space and a vector?

A vector is a mathematical object that represents both magnitude and direction. It can be thought of as an arrow pointing from one point to another. A linear vector space, on the other hand, is a set of vectors that satisfies certain properties, as described above. In other words, a vector is an element of a linear vector space, but a linear vector space is not necessarily just one vector.

5. Can a linear vector space have more than one dimension?

Yes, a linear vector space can have any number of dimensions, including more than one. A one-dimensional vector space would contain vectors that can only be represented by a single number (such as the real number line), while a two-dimensional vector space would contain vectors that can be represented by two numbers (such as a plane). Higher dimensional vector spaces are also possible, such as three-dimensional space or even infinite-dimensional space. The important factor is that the set follows all of the properties of a linear vector space.
