What is the best way of describing isomorphism between two vector spaces?

1. Oct 28, 2012

matqkks

What is the best way of describing isomorphism between two vector spaces? Is there a real life analogy of isomorphism?

2. Oct 28, 2012

chiro

Re: Isomorphism

Hey matqkks.

Think of a compression algorithm that takes one vector and reduces it to a smaller vector.

In a lossless compression process (as opposed to a lossy one), the exact description of the data must be preserved when we decompress it; the round trip from uncompressed to compressed and back again is one example of that.

You can think of the compressed and uncompressed data formats as vector spaces (or more correctly, realizations of those spaces) where you are going from one space to another.

This is a really crude way of visualizing a real example, but in one sense you can treat the data as vectors if it is interpreted in the right way.

The deeper issue here relates to the intrinsic dimension of the data, and that is a very difficult thing, because linear algebra only allows us to reduce something to a basis if everything is linear.

If there is a non-linear relationship, or your space has a transformation to something that looks linear and can be reduced further than it has been, then you may have miscalculated the dimension under some other transformation, or more correctly under some other basis.

Vector spaces specifically deal with things that act like linear objects, but a transformation from one space to another doesn't necessarily have to look linear, even if each space itself, with its own scalar multiplication and addition operations, is linear and behaves like arrows.
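The compression analogy above can be made concrete with a small numerical sketch (the matrix and names here are illustrative, not from the thread): an invertible linear map acts like a lossless re-encoding of vectors, because nothing is lost and we can always map back.

```python
import numpy as np

encode = np.array([[2.0, 1.0],
                   [1.0, 1.0]])   # invertible, so the map is a bijection
decode = np.linalg.inv(encode)

v = np.array([3.0, -4.0])
w = encode @ v                    # the "compressed"/re-encoded form
v_back = decode @ w               # "decompression" recovers v exactly

assert np.allclose(v, v_back)

# The map also preserves the vector-space operations (linearity):
u = np.array([1.0, 5.0])
assert np.allclose(encode @ (2 * v + u), 2 * (encode @ v) + encode @ u)
```

Because the map is both a bijection and linear, it is a vector space isomorphism; a lossy compression would correspond to a non-invertible map, which is not.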

3. Oct 29, 2012

Fredrik

Staff Emeritus
Re: Isomorphism

The idea behind the concept of "isomorphism" is that two isomorphic structures (groups, rings, fields, vector spaces, etc.) are "essentially the same thing". If one structure is useful in a theory of physics for example, any structure that's isomorphic to it will do the job just as well.

A vector space consists of a triple (V,S,A) where V is the underlying set, S is the scalar multiplication operation, and A is the addition operation. The standard notation for the latter two is of course S(a,x)=ax and A(x,y)=x+y. If the vector space is (V,S,A), what we really mean when we write something like ax+by=z, is A(S(a,x),S(b,y))=z.

To say that (V,S,A) is isomorphic to (W,T,B) is to say that there's a bijection $f:V\to W$ such that every single statement about members of V will remain equally true (or equally false) if we make the substitutions V→W, S→T, A→B, x→f(x). (We have to do that last one for every free variable that represents a member of V, not just x). For example, if we make those substitutions in the statement A(S(a,x),S(b,y))=z, we get B(T(a,f(x)),T(b,f(y)))=f(z). If we make them in the statement
For all x in V, there's a y in V such that A(S(a,x),S(b,y))=z,​
we get
For all x in W, there's a y in W such that B(T(a,x),T(b,y))=f(z).​
Note that it wouldn't make sense to make the substitution x→f(x) and y→f(y) here, because x is the target of a "for all" and y is the target of a "there exists".
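The substitution rule above can be checked numerically in a concrete case (the particular spaces and map are my own illustration, not from the post): take V = W = R², let S/T be scalar multiplication and A/B be addition, and let f be an invertible linear map. Then whenever A(S(a,x),S(b,y)) = z holds in V, the translated statement B(T(a,f(x)),T(b,f(y))) = f(z) holds in W.

```python
import numpy as np

F = np.array([[1.0, 2.0],
              [0.0, 1.0]])        # invertible, so f is a bijection
f = lambda v: F @ v

a, b = 3.0, -2.0
x = np.array([1.0, 4.0])
y = np.array([2.0, -1.0])
z = a * x + b * y                 # A(S(a, x), S(b, y)) = z, true in V

# The substituted statement B(T(a, f(x)), T(b, f(y))) = f(z), true in W:
assert np.allclose(a * f(x) + b * f(y), f(z))
```

This is exactly the linearity condition f(ax + by) = a f(x) + b f(y) written in the (V,S,A)/(W,T,B) notation.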

I don't know if that was very enlightening. The concepts tend to get obscured by the notation. It might be better to consider one of the simplest possible examples. Consider the group ({1,-1},*) where * is the multiplication operation on the set of real numbers, restricted to the subset {-1,1}. Since the underlying set has only two members, we can easily compute all the products:
1*1=1
1*(-1)=-1
(-1)*1=-1
(-1)*(-1)=1
Now compare this to the group ({0,1},+) where + denotes addition modulo 2. These are all the possible sums:
0+0=0
0+1=1
1+0=1
1+1=0
Just by looking at these results, you can see that the second list is just the first list in a different notation. If we take the first group and write 0 instead of 1, 1 instead of -1, and + instead of *, the first list turns into the second. This is a good reason to think of the two groups as "essentially the same".
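The "different notation for the same thing" claim can be verified mechanically: define the relabeling f(1) = 0, f(-1) = 1 and check, for all four products, that f turns the first table into the second.

```python
# f relabels the elements of ({1, -1}, *) as elements of ({0, 1}, + mod 2).
f = {1: 0, -1: 1}

for a in (1, -1):
    for b in (1, -1):
        # Homomorphism property: f(a * b) == f(a) + f(b) (mod 2)
        assert f[a * b] == (f[a] + f[b]) % 2

print("f preserves the group operation on all four products")
```

Since f is also a bijection between the two underlying sets, it is a group isomorphism, which is the precise content of "the second list is just the first list in a different notation".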

The concept of "isomorphism" is meant to make the idea of "essentially the same" mathematically precise, in a way that makes sense even when the underlying sets of the two structures are infinite.

Last edited: Oct 29, 2012
4. Oct 30, 2012

Jim Kata

Re: Isomorphism

Isomorphisms between vector spaces are given by invertible linear maps. Look up the fundamental theorem of linear algebra and the rank-nullity theorem; these are the vector space equivalents of the first isomorphism theorem. A simple example that might give you some idea is the dot product. Let V be a finite dimensional vector space and let V* be the set of all linear functionals from V to $$\mathbb{R}$$. You can show quite easily that V* is a vector space and that dim(V*)=dim(V) for finite dimensional V, and this implies $$V \cong V^*$$.
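A hedged sketch of the dual-space point, with V = R^n and the standard dot product (the representation chosen here is an assumption for illustration, not from the post): every vector v gives the linear functional x ↦ v·x, and the dual basis e_i* picks out the i-th coordinate.

```python
import numpy as np

n = 3
basis = np.eye(n)                 # standard basis e_1, ..., e_n of R^n

def functional(v):
    """The element of V* represented by v: the map x |-> v . x."""
    return lambda x: float(np.dot(v, x))

# The dual basis e_i* = functional(e_i) satisfies e_i*(e_j) = delta_ij,
# so the n functionals are linearly independent and span V*:
for i in range(n):
    ei_star = functional(basis[i])
    for j in range(n):
        assert ei_star(basis[j]) == (1.0 if i == j else 0.0)
```

The assignment v ↦ functional(v) is linear and (in finite dimensions) a bijection onto V*, which is one way to exhibit the isomorphism V ≅ V* that the dimension count dim(V*) = dim(V) guarantees.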