Proof That Every Linear Operator L:ℝ→ℝ Has Form L(x)=cx

LosTacos

Homework Statement


Show that every linear operator L:ℝ→ℝ has the form L(x) = cx for some c in ℝ.


Homework Equations


A linear operator on a vector space V is a linear transformation whose domain and codomain are both V.


The Attempt at a Solution


If L is a map from the real numbers to the real numbers, then I need to show that multiplying any element of the real numbers by a scalar yields another real number in V. Assume the contrary: there exists a scalar c such that c times an element x in ℝ yields a number that is not in the reals. The only c that would work is a number not contained in ℝ. However, by hypothesis, c is in ℝ. Therefore, this is a contradiction.

Does this seem right?
 
LosTacos said:

If L is a map from the real numbers to the real numbers, then I need to show that multiplying any element of the real numbers by a scalar yields another real number in V. Assume the contrary: there exists a scalar c such that c times an element x in ℝ yields a number that is not in the reals. The only c that would work is a number not contained in ℝ. However, by hypothesis, c is in ℝ. Therefore, this is a contradiction.

Does this seem right?

No, that doesn't have anything to do with linearity. Write x=x*1. Now apply linearity to that.
 
Would you use induction? That is, show that x = x*1 is true, assume it is true for n, and then show that it is true for n+1?
 
Maybe you can think of it this way: your map has a matrix representation with respect to a basis. What is a basis for the reals as a vector space over itself?
 
It would be a basis if it spans the reals and is linearly independent. Therefore, after multiplying by a constant that is an element of the reals, the basis would still span the vector space.
 
LosTacos said:
Would you use induction? That is, show that x = x*1 is true, assume it is true for n, and then show that it is true for n+1?

No, that's not what I'm thinking. I'm thinking L(x)=L(x*1)=xL(1). How might that help you?
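
For reference, the middle step there is the homogeneity part of linearity, ##L(\alpha v)=\alpha L(v)##, applied with the scalar ##\alpha = x## and the vector ##v = 1##:
$$L(x\cdot 1)=x\,L(1).$$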
 
By the commutative property, you can see that any real number multiplied by x can be separated so that the codomain and domain are both in V.
 
LosTacos said:
By the commutative property, you can see that any real number multiplied by x can be separated so that the codomain and domain are both in V.

I'm not sure what that has to do with the problem, but sure, L maps V to V. Now what about this constant c you are supposed to prove exists?
 
Because L maps V to V, any constant times a real number will map to another real number in V.
 
  • #10
LosTacos said:
Because L maps V to V, any constant times a real number will map to another real number in V.

Sure any real constant times a real number gives a real number. I don't see how that proves L(x)=cx. L(x)=x^3 also maps real numbers to real numbers. So?
 
  • #11
Could you say that any real number could be represented as the real number x*1, and therefore, by factoring it out, you can show that L(x) is the product of c and x?
 
  • #12
LosTacos said:
Could you say that any real number could be represented as the real number x*1, and therefore, by factoring it out, you can show that L(x) is the product of c and x?

Why don't you try and say what c is in terms of L? I gave you a big hint a while back.
 
  • #13
c is 1, where x is any real number. Thus 1 times x = x for all c in R.
 
  • #14
LosTacos said:
c is 1, where x is any real number. Thus 1 times x = x for all c in R.

L(x)=x is one particular linear operator. There are other operators where c isn't 1. Look, I'll say it again. L(x)=L(x*1)=xL(1). Think about that and what you have to prove.
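
Spelled out, the hint is the whole calculation once the constant is given a name: for any ##x\in\mathbb R##,
$$L(x)=L(x\cdot 1)=x\,L(1)=cx,\qquad\text{where } c:=L(1).$$
Since ##L(1)## is a single fixed real number, this is exactly the required form.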
 
  • #15
Another approach: any non-zero element of the reals is a basis for R over itself. Then the representation of a linear map L: R → R must be given by a 1×1 matrix times x...
 
  • #16
Why must it be given by a 1×1 matrix? Is this the determinant? And what would this multiply?
 
  • #17
If you did not understand Dick's "L(x) = L(x*1) = xL(1)", you need to write out the definition of "Linear Transformation". If you know the definitions, this problem should have taken you about two minutes.
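
For reference, the definition being referred to: a map ##L:V\to V## is a linear transformation if, for all vectors ##u,v\in V## and all scalars ##\alpha##,
$$L(u+v)=L(u)+L(v)\qquad\text{and}\qquad L(\alpha v)=\alpha L(v).$$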
 
  • #18
Bacle2 said:
Another approach: any non-zero element of the reals is a basis for R over itself. Then the representation of a linear map L: R → R must be given by a 1×1 matrix times x...
This is much harder, since the concept of matrix representation of linear operators is somewhat difficult. It's certainly something that not everyone is familiar with (which is kind of ridiculous, considering that it's one of the most important things in linear algebra, but still). Also, if they don't understand that, then they don't understand what you mean by "R over itself" either.

LosTacos, if you don't understand that Dick has given you the complete solution twice, the problem has to be that you don't understand what the statement that you're supposed to prove means. So I suggest that you focus on that. Make sure that you understand every single word in the statement you're supposed to prove, in particular "linear".
 
  • #19
Because a linear map between R^n and R^m is given by an n×m matrix once you choose a basis. Yes. Any (1-dimensional) vector, i.e., any number.
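
As a sketch of that approach: take ##\{1\}## as a basis of ##\mathbb R## over itself. The matrix of ##L## in that basis is the ##1\times 1## matrix ##[\,L(1)\,]##, so with ##c=L(1)##,
$$L(x)=[\,c\,][\,x\,]=cx.$$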
 
  • #20
Fredrik said:
This is much harder, since the concept of matrix representation of linear operators is somewhat difficult. It's certainly something that not everyone is familiar with (which is kind of ridiculous, considering that it's one of the most important things in linear algebra, but still). Also, if they don't understand that, then they don't understand what you mean by "R over itself" either.

LosTacos, if you don't understand that Dick has given you the complete solution twice, the problem has to be that you don't understand what the statement that you're supposed to prove means. So I suggest that you focus on that. Make sure that you understand every single word in the statement you're supposed to prove, in particular "linear".

But, if I understood the argument well, I think Dick has only addressed the homogeneous aspect of a linear map, i.e., the fact that L(cx)=cL(x), and not the additivity, i.e., L(x+y)=L(x)+L(y). I'm pretty sure that every homogeneous map from R to itself is linear, but I think this needs to be addressed in the argument. But you may be right that maybe matrix representations are not too natural.

I also had the impression from Los Tacos' post that he was familiar with matrices and matrix multiplication.
 
  • #21
Bacle2 said:
But, if I understood the argument well, I think Dick has only addressed the homogeneous aspect of a linear map, i.e., the fact that L(cx)=cL(x), and not the additivity, i.e., L(x+y)=L(x)+L(y). I'm pretty sure that every homogeneous map from R to itself is linear, but I think this needs to be addressed in the argument. But you may be right that maybe matrix representations are not too natural.
You had me worried there, but the problem only asks us to prove that for all linear ##L:\mathbb R\to\mathbb R##, there's a real number c such that ##L(x)=cx## for all ##x\in\mathbb R##. What Dick said is the only calculation we need to make. The full proof consists of that calculation and some statements about how the calculation proves the theorem.
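
(As a side remark on the additivity point: once ##L(x)=cx## is established, additivity comes for free, since ##L(x+y)=c(x+y)=cx+cy=L(x)+L(y)##.)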

Matrix representations as a solution to this problem are natural to me, but in my opinion, almost certainly not to a person who asks about this problem.

Bacle2 said:
I also had the impression from Los Tacos' post that he was familiar with matrices and matrix multiplication.
When I studied quantum mechanics (the second QM course, two years after our linear algebra course), we were all familiar with matrices and matrix multiplication, but I was the only one who fully understood the relationship between linear transformations and matrices. Most of the others didn't even recognize ##(AB)_{ij}=\sum_k A_{ik}B_{kj}## as the definition of matrix multiplication.
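
As a concrete instance of that definition, for ##2\times 2## matrices the top-left entry of the product is
$$(AB)_{11}=A_{11}B_{11}+A_{12}B_{21}.$$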
 
  • #22
Fredrik said:
You had me worried there, but the problem only asks us to prove that for all linear ##L:\mathbb R\to\mathbb R##, there's a real number c such that ##L(x)=cx## for all ##x\in\mathbb R##. What Dick said is the only calculation we need to make. The full proof consists of that calculation and some statements about how the calculation proves the theorem.

Matrix representations as a solution to this problem are natural to me, but in my opinion, almost certainly not to a person who asks about this problem.

When I studied quantum mechanics (the second QM course, two years after our linear algebra course), we were all familiar with matrices and matrix multiplication, but I was the only one who fully understood the relationship between linear transformations and matrices. Most of the others didn't even recognize ##(AB)_{ij}=\sum_k A_{ik}B_{kj}## as the definition of matrix multiplication.

Right, my bad, I misread the problem, Dick's solution is correct.

Sorry to nitpick, Fredrik, but shouldn't that sum be over the product ##a_{ik}b_{kj}##, where ##A=(a_{ij})##, etc.? Maybe I misunderstood/misunderestimated your notation.

EDIT: never mind, I misread again. Let me head out to Starbucks and see if caffeine can sharpen me up.
 