Mapping (linear transformation)

In summary, a vector x is orthogonal to the subset V1 if <x,y> = 0 for all y in V1. The map f(x) = <x,v>v is a linear transformation; its range is the set of scalar multiples of v, and its kernel consists of the vectors orthogonal to v (in particular, x = 0 is in the kernel).
  • #1
reha
If V is a vector space with an inner product <.,.>, and V1 is a non-empty subset of V, then a vector x in V is said to be orthogonal to V1 if <x,y> = 0 for all y in V1.

1) If v is in V, define the mapping f(x) = <x,v>v. Show that f is a linear transformation and describe its range and kernel.

2) If V1 is a subspace of V, show that the direct sum of V1 and its orthogonal complement equals V.

I tried to prove these as they were stated on a website, but failed. Please kindly assist me on this matter.

Thanks.
 
  • #2
Be glad to help. Show us what you did and we'll make suggestions.
 
  • #3
HallsofIvy said:
Be glad to help. Show us what you did and we'll make suggestions.

For 1)
f(x) = <x,v>v. Show f is a linear transformation.

Since <x,v> = <v,x> (symmetry),

hence <x,v>v = <v,x>v = v^T x v = x = f(x). Am I right? But I feel it's absurd. (v^T is v transpose.)

Range and kernel.
Range: please help me out, I have no idea how to describe it. Thanks.

Kernel: I was thinking, if x = 0, then f(0) = 0. Is this right?

2) If V1 is a subspace of V, show that the direct sum of V1 and its orthogonal complement equals V.

I tried to verify this by thinking that, since V1 is a subspace of V, the direct sum of V1 and its orthogonal complement has to be in V, since V is a vector space.

Please help me out. Thanks.
 
  • #5
reha said:
For 1)
f(x) = <x,v>v. Show f is a linear transformation.

Since <x,v> = <v,x> (symmetry),

hence <x,v>v = <v,x>v = v^T x v = x = f(x). Am I right? But I feel it's absurd. (v^T is v transpose.)
I agree with respect to the absurdity :smile: Where does the transpose come from? There is no mention of a basis in the statement of the problem, so what is this transpose supposed to be? Anyway, you don't need it. And you didn't say what f is.
What you are considering is the transformation [itex]T:V\to V[/itex] defined by [itex]Tx=\langle x,v\rangle v[/itex].
To show that this transformation is linear, you must show that [itex]T(ax+by)=aTx+bTy[/itex], where a, b are scalars and x, y are elements of V. You said < , > is an inner product. What properties does it thus have? Do these properties help you prove linearity of T?
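As an illustration only (not a proof), here is a quick numeric check of that identity in Python/NumPy; the choice of R^3, the random vectors, and the scalars are arbitrary and made only for this sketch.

[code]
# Numeric sanity check (not a proof): in R^3 with the standard dot product,
# verify T(a*x + b*y) == a*T(x) + b*T(y) for the map T(x) = <x, v> v.
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(3)           # the fixed vector v from the problem

def T(x):
    """T(x) = <x, v> v, with <.,.> the standard dot product."""
    return np.dot(x, v) * v

x, y = rng.standard_normal(3), rng.standard_normal(3)
a, b = 2.0, -0.7                     # arbitrary scalars

lhs = T(a * x + b * y)
rhs = a * T(x) + b * T(y)
print(np.allclose(lhs, rhs))         # True, consistent with linearity
[/code]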
reha said:
Range and kernel.
Range: please help me out, I have no idea how to describe it. Thanks.
Whatever you have T acting on, the result is a scalar multiple of v, is it not? Namely, <x,v>v, where <x,v> is said scalar.
reha said:
Kernel: I was thinking, if x = 0, then f(0) = 0. Is this right?
This is right and means that the zero vector is in the kernel of T (this is true for any linear transformation). You want to see whether there are any other vectors in the kernel. If x is in ker T, then <x,v>v = 0, and if v is not the zero vector, this means <x,v> = 0. In your first post, you stated something about orthogonality; maybe this helps here...
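Again purely as an illustration (R^3, the standard dot product, and the particular random vectors are assumptions made only for this sketch): every output of T is a scalar multiple of v, and any vector orthogonal to v is sent to zero.

[code]
# Illustration (not a proof): the range of T lies on the line spanned by v,
# and any x orthogonal to v is sent to the zero vector.
import numpy as np

rng = np.random.default_rng(1)
v = rng.standard_normal(3)

def T(x):
    return np.dot(x, v) * v                        # T(x) = <x, v> v

# Range: T(x) is parallel to v (their cross product vanishes).
x = rng.standard_normal(3)
print(np.allclose(np.cross(T(x), v), 0.0))

# Kernel: remove the v-component of x to get a vector orthogonal to v ...
w = x - (np.dot(x, v) / np.dot(v, v)) * v
print(np.isclose(np.dot(w, v), 0.0))               # w is orthogonal to v
print(np.allclose(T(w), np.zeros(3)))              # ... and T sends it to 0
[/code]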
reha said:
2) If V1 is a subspace of V, show that the direct sum of V1 and its orthogonal complement equals V.

I tried to verify this by thinking that, since V1 is a subspace of V, the direct sum of V1 and its orthogonal complement has to be in V, since V is a vector space.
To show that [itex]V=V_1\oplus V_1^\bot[/itex] you have to show that every vector x in V can be uniquely decomposed as x = w + u, where w is in V1 and u is in the orthogonal complement of V1. (In the special case V1 = span{v} with v a unit vector, x = <x,v>v + (x - <x,v>v) is such a decomposition; for a general finite-dimensional V1, take w to be the orthogonal projection of x onto V1, built for instance from an orthonormal basis of V1.) To prove uniqueness, assume that x = w' + u' is another such decomposition. It follows that w - w' = u' - u; the vector on the left lies in V1, the vector on the right in its orthogonal complement. Since the only vector a subspace and its orthogonal complement have in common is the zero vector (prove it!), we see that w = w' and u = u', i.e. the decomposition is unique.
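A numeric sketch of that decomposition in Python/NumPy (the 2-dimensional subspace of R^4, the spanning matrix A, and the use of a QR factorization to get an orthonormal basis are all assumptions made only for this example):

[code]
# Sketch of the decomposition x = p + u with p in V1 and u in the orthogonal
# complement of V1. Here V1 is spanned by the columns of A; an orthonormal
# basis Q of V1 is obtained from a QR factorization.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 2))      # columns span V1 (assumed independent)
Q, _ = np.linalg.qr(A)               # orthonormal basis of V1

x = rng.standard_normal(4)
p = Q @ (Q.T @ x)                    # orthogonal projection of x onto V1
u = x - p                            # the remainder

print(np.allclose(p + u, x))         # the decomposition reproduces x
print(np.allclose(Q.T @ u, 0))       # u is orthogonal to every basis vector of V1
[/code]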
reha said:
Please help me out. Thanks.
 
  • #6
Pere Callahan said:
I agree with respect to the absurdity :smile: Where does the transpose come from? There is no mention of a basis in the statement of the problem, so what is this transpose supposed to be? Anyway, you don't need it. And you didn't say what f is.
What you are considering is the transformation [itex]T:V\to V[/itex] defined by [itex]Tx=\langle x,v\rangle v[/itex].
To show that this transformation is linear, you must show that [itex]T(ax+by)=aTx+bTy[/itex], where a, b are scalars and x, y are elements of V. You said < , > is an inner product. What properties does it thus have? Do these properties help you prove linearity of T?

f is a mapping, defined by f(x) = <x,v>v.

Inner product properties: linearity, symmetry (which I had used), and definiteness. Do I need to use all of them?

Thanks.
 
  • #7
reha said:
f is a mapping, defined by f(x) = <x,v>v.

Inner product properties: linearity, symmetry (which I had used), and definiteness. Do I need to use all of them?

Thanks.
OK, I called this f T, but that doesn't make any difference, of course. Yes, you should use in particular the linearity of the inner product to prove linearity of f (or T).
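For reference, a one-line version of that step; it assumes only the standard axiom that a (real) inner product is linear in its first argument:

[tex]f(ax+by) = \langle ax+by,\,v\rangle v = \bigl(a\langle x,v\rangle + b\langle y,v\rangle\bigr)v = a\langle x,v\rangle v + b\langle y,v\rangle v = a\,f(x) + b\,f(y).[/tex]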
 
  • #8
Pere Callahan said:
OK, I called this f T, but that doesn't make any difference, of course. Yes, you should use in particular the linearity of the inner product to prove linearity of f (or T).

Thanks. Can you please tell me if the following is correct:

f(x)= <x,v>v
Say a is a scalar and s is a vector; hence

f(x)= <(ax+s), v>v
= axv^2 + sv^2

Thanks.

Please correct my mistake. By the way, why can't I use the symmetry property which I had used earlier?
 
  • #9
Pere Callahan said:
Whatever you have T acting on, the result is a scalar multiple of v, is it not? Namely, <x,v>v, where <x,v> is said scalar.
Please explain again. I don't get why it is a scalar; I thought x and y are vectors.

Pere Callahan said:
This is right and means that the zero vector is in the kernel of T (this is true for any linear transformation). You want to see whether there are any other vectors in the kernel. If x is in ker T, then <x,v>v = 0, and if v is not the zero vector, this means <x,v> = 0. In your first post, you stated something about orthogonality; maybe this helps here...

Orthogonality. Can I say that, if <x,v> = 0 (since v is contained in V), then the zero vector in f defines ker T?

Thanks.
 
  • #10
reha said:
f(x)= <x,v>v
Say a is a scalar and s is a vector; hence

f(x)= <(ax+s), v>v
= axv^2 + sv^2

I'm sorry, but this shows that your understanding of the terms you are using and the objects you are working with is insufficient for me to be able to help you any further on this question.
In a forum, I (and presumably many others) can try to resolve specific problems or questions, but I cannot make up for a whole course on linear algebra, nor for the basics of mathematical insight and comprehension, which are acquired only with time and practice, not by being told in a step-by-step fashion how to solve this or that specific exercise.
Good luck.
 

1. What is a mapping in a linear transformation?

Mapping in a linear transformation refers to the process of transforming one set of data, the input, into another set of data, the output, using a mathematical function called a linear transformation. The function takes input vectors and produces corresponding output vectors according to fixed linear rules: it preserves vector addition and scalar multiplication.
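A minimal sketch of this idea in Python/NumPy; the 2x2 matrix and the vectors are arbitrary choices made only for the example:

[code]
# Minimal example: a linear transformation on R^2 represented by a matrix.
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])           # example map: stretch x by 2, y by 3

def f(x):
    return A @ x                     # f(x) = A x

x = np.array([1.0, 1.0])
print(f(x))                          # [2. 3.]
# Linearity: f(a*x + b*y) == a*f(x) + b*f(y) for any scalars a, b.
[/code]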

2. How is mapping used in scientific research?

Mapping is used in scientific research to analyze and understand patterns in data. It allows researchers to visualize and interpret complex data sets, identify relationships between variables, and make predictions based on the data. Mapping is particularly useful in fields such as geology, geography, and biology, where spatial data is commonly used.

3. What are some common types of linear transformations?

Some common types of linear transformations include scaling, which changes the size or magnitude of the data, and rotation, which changes its orientation. Translation, which shifts the data along a given direction, is strictly an affine rather than a linear operation, although it is often discussed alongside them. Other types include reflection, shearing, and projection, each of which has specific mathematical rules and applications.
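As an illustration (the specific matrices, the angle, and the test vector are assumptions made only for this example), here is what a few of these transformations look like as 2x2 matrices acting on R^2; translation is omitted because it cannot be written as a single 2x2 matrix:

[code]
# Example matrices for some of the transformations listed above (in R^2).
import numpy as np

theta = np.pi / 4                                     # 45-degree rotation angle

scale   = np.array([[2.0, 0.0], [0.0, 0.5]])          # scaling
rotate  = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]]) # rotation
reflect = np.array([[1.0, 0.0], [0.0, -1.0]])         # reflection across the x-axis
shear   = np.array([[1.0, 1.0], [0.0, 1.0]])          # horizontal shear
project = np.array([[1.0, 0.0], [0.0, 0.0]])          # projection onto the x-axis

x = np.array([1.0, 2.0])
for name, M in [("scale", scale), ("rotate", rotate), ("reflect", reflect),
                ("shear", shear), ("project", project)]:
    print(name, M @ x)
[/code]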

4. How does mapping differ from other types of transformations?

Mapping here refers specifically to transforming data with a linear function, where each input corresponds to exactly one output and linear combinations are preserved: f(ax + by) = af(x) + bf(y). Nonlinear transformations need not obey this rule. As a consequence, a linear map sends lines through the origin to lines through the origin (or to the origin itself), whereas other transformations may bend or distort them.

5. What are some real-world applications of mapping in linear transformation?

Mapping is used in a variety of real-world applications, such as cartography, where it is used to create maps of geographical features; image processing, where it is used to manipulate and enhance images; and data analysis, where it is used to understand relationships between variables. It is also commonly used in computer graphics, engineering, and physics, among other fields.
