MHB Vector Space Question: Basis Vectors and Relatedness Explained

moyo
Could we have two vector spaces, each with its own set of basis vectors, but with the basis vectors related in the following way: a particular set of vectors in the first vector space may exist "all over the place", but when you represent the same information in the second vector space, the discrete vectors in the first space can still be made out in the second space, only now they line up end to end to form one composite vector.
 
It is not easy for me to understand exactly what you are asking. In particular:

moyo said:
Could we have two vector spaces, each with its own set of basis vectors, but with the basis vectors related in the following way: a particular set of vectors in the first vector space may exist "all over the place"

What does "all over the place" mean here?

moyo said:
but when you represent the same information in the second vector space,

What do you mean by "represent"?

moyo said:
the discrete vectors in the first space can still be made out in the second space

What do you mean by "can be made out"? Do you mean that the members of the first basis are linear combinations of members of the second basis?

moyo said:
only now they line up end to end to form one composite vector.

Are you talking here specifically about concatenating $n$-vectors to form a new vector?

It would help if you could provide an example of what you are thinking about. You could start with real or complex spaces $V$ and $W$ spanned by bases $\{v_1,\ldots,v_m\}$ and $\{w_1,\ldots,w_n\}$ and try to explain your question using symbols and well-defined terms.
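For concreteness, here is a minimal NumPy sketch (the bases are an illustrative choice, not anything from the question) contrasting two of the notions above: expressing one basis in terms of another, versus concatenating vectors into a bigger space.

```python
import numpy as np

# Two bases of R^2 (illustrative choice):
# the standard basis and a rotated/scaled basis.
v1, v2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
w1, w2 = np.array([1.0, 1.0]), np.array([-1.0, 1.0])

W = np.column_stack([w1, w2])   # basis vectors as columns

# Coordinates of v1 and v2 relative to the basis {w1, w2}:
# solve W @ c = v for each v.
c1 = np.linalg.solve(W, v1)
c2 = np.linalg.solve(W, v2)
print(c1, c2)   # v1 = c1[0]*w1 + c1[1]*w2, and similarly for v2

# Concatenation, by contrast, builds a vector in a *bigger* space:
x, y = np.array([1.0, 2.0]), np.array([3.0, 4.0])
xy = np.concatenate([x, y])     # lives in R^4, not R^2
print(xy)
```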
 
Krylov said:
It is not easy for me to understand exactly what you are asking. In particular:
What does "all over the place" mean here?
What do you mean by "represent"?
What do you mean by "can be made out"? Do you mean that the members of the first basis are linear combinations of members of the second basis?
Are you talking here specifically about concatenating $n$-vectors to form a new vector?

It would help if you could provide an example of what you are thinking about. You could start with real or complex spaces $V$ and $W$ spanned by bases $\{v_1,\ldots,v_m\}$ and $\{w_1,\ldots,w_n\}$ and try to explain your question using symbols and well-defined terms.

Hi, sorry for being unclear.

My question, with an example: take the vector space R^2 (Euclidean space) with the following two functions: f(x) = x/2 and g(x) = 2x + 1. These two functions occupy different regions of R^2 spatially. Now say we were to manipulate R^2's basis vectors, somehow, to create new basis vectors for another space M. Could we manipulate them in such a way that the two functions mentioned above coincide, i.e. occupy the same points within M? And what sort of manipulation would you need to perform? I take it that it would still be some sort of transformation matrix, but my question is essentially: could you represent any function as a transformation matrix, which seems to be necessary for that to happen? I.e., could the transformation matrix exist in its own space?
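For what it's worth, a change of basis is an invertible *linear* map and fixes the origin, so on its own it can only carry the line y = x/2 (which passes through the origin) onto other lines through the origin; y = 2x + 1 misses the origin, so at minimum an *affine* map (a linear part plus a translation) is needed. A minimal NumPy sketch of one such map; the particular matrix is just one choice that happens to work:

```python
import numpy as np

# Graphs of f(x) = x/2 and g(x) = 2x + 1 as point sets in R^2.
xs = np.linspace(-5, 5, 11)
graph_f = np.stack([xs, xs / 2], axis=1)      # points (x, x/2)
graph_g = np.stack([xs, 2 * xs + 1], axis=1)  # points (x, 2x+1)

# One affine map T(p) = A @ p + b carrying graph_f onto graph_g:
# (x, x/2) -> (x, 4*(x/2) + 1) = (x, 2x + 1).
A = np.array([[1.0, 0.0],
              [0.0, 4.0]])
b = np.array([0.0, 1.0])

mapped = graph_f @ A.T + b
print(np.allclose(mapped, graph_g))  # True: the two graphs now coincide
```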
 
moyo said:
Hi, sorry for being unclear.

My question, with an example: take the vector space R^2 (Euclidean space) with the following two functions: f(x) = x/2 and g(x) = 2x + 1. These two functions occupy different regions of R^2 spatially. Now say we were to manipulate R^2's basis vectors, somehow, to create new basis vectors for another space M. Could we manipulate them in such a way that the two functions mentioned above coincide, i.e. occupy the same points within M? And what sort of manipulation would you need to perform? I take it that it would still be some sort of transformation matrix, but my question is essentially: could you represent any function as a transformation matrix, which seems to be necessary for that to happen? I.e., could the transformation matrix exist in its own space?

I suppose we could map the two functions to each other and use that function as the basis for the transformation matrix. This would also be scalable to n functions. Then, for different alignments of those functions in M, we opt not to use the basis vectors of the transformation-matrix vector space as such, but composite vectors instead.
 
Sorry if I sound naive, or am being naive...

I have the following scenario.

I have a number of vectors that, when added together, give one composite vector. The first vectors represent aspects of a premise, while the composite vector represents the conclusion. I have arranged it that way.

Now we have a set of sets of vectors, or hyper-matrices, consisting of the premise-aspect vectors and the sum of all of them (i.e. the conclusion vector) concatenated.

There are many of these matrices.

Now all these matrices are representations of yet another aspect, i.e. they are all proofs. They share that equivalence.

So in some other vector space they occupy the same point, or are the same vector.

What would be an appropriate set of basis vectors for that last space, as a function of the basis vectors we started off with? And we have another kind of relationship between the first vector space and the second one, before we reach the third. How can we express it... the one where we concatenate the premise vectors to the conclusion vectors to form another set of vectors in another space?

This is a real problem I am trying to solve; I have omitted information that is irrelevant.
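If I read the setup correctly, a minimal sketch of the "premises sum to a conclusion, then everything is concatenated" construction might look like the following (all names, values, and dimensions are hypothetical). The concatenated records live in a direct sum of copies of the original space, and one natural answer to the basis question is the block (direct-sum) basis: each original basis vector, repeated once per block.

```python
import numpy as np

# Hypothetical setup: premise vectors p_1..p_k in R^n; the conclusion is their sum.
premises = [np.array([1.0, 0.0, 2.0]),
            np.array([0.0, 3.0, 1.0]),
            np.array([2.0, 1.0, 0.0])]
conclusion = np.sum(premises, axis=0)

# Concatenated record of one "proof": all premises followed by the conclusion.
# It lives in R^(n*(k+1)), a bigger space, not in the original R^n.
proof = np.concatenate(premises + [conclusion])
print(proof.shape)   # (12,) for k=3 premises in R^3
```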
 
I went back and tried to study a bit more... but with YouTube videos, so forgive me if I am still naive...

I have a few questions that I would appreciate answers to.

Suppose I have a vector space of the following form: there is a multidimensional space that these vectors live in, and a particular matrix formed from some of the vectors has a determinant of zero.

Now, are we able to apply curvature to the vector space in order to increase the value of the determinant from zero to something positive? And how would we do this?
 
Vector spaces do not have "curvature". They have vectors, scalars, and the operations of adding two vectors and of multiplying a vector by a scalar.

If a matrix, corresponding to some linear transformation, has determinant 0, then the linear transformation has a non-trivial kernel. That is, there exists some non-zero subspace such that every vector in that subspace is mapped to the 0 vector. What you can do is restrict the linear transformation to the orthogonal complement of the kernel. The matrix corresponding to that restricted linear transformation will have a non-zero determinant.
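A minimal NumPy sketch of this suggestion, using the SVD to expose the kernel and its orthogonal complement (the matrix and the rank tolerance 1e-12 are arbitrary illustrative choices):

```python
import numpy as np

# A singular matrix: the second column is twice the first, so det = 0.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(A))  # 0.0 (up to floating-point rounding)

# The SVD A = U @ diag(s) @ Vt exposes kernel and complement.
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))     # rank; r = 1 here

# Columns of V beyond index r span the kernel; the first r columns
# span its orthogonal complement (the row space). In the SVD bases,
# the restriction of A to that complement is just diag of the
# nonzero singular values:
restricted = np.diag(s[:r])
print(np.linalg.det(restricted))  # nonzero
```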
 
Country Boy said:
Vector spaces do not have "curvature". They have vectors, scalars, and the operations of adding two vectors and of multiplying a vector by a scalar.

If a matrix, corresponding to some linear transformation, has determinant 0, then the linear transformation has a non-trivial kernel. That is, there exists some non-zero subspace such that every vector in that subspace is mapped to the 0 vector. What you can do is restrict the linear transformation to the orthogonal complement of the kernel. The matrix corresponding to that restricted linear transformation will have a non-zero determinant.

I have these videos that seem to say there are such concepts?

https://www.youtube.com/watch?v=NlcvU67YWpQ&list=PLJ8OrXpbC-BNHmhFI4i_EK3vUmuMgZ6wb&index=54

https://www.youtube.com/watch?v=cq3Yf8OGZpo&list=PLJ8OrXpbC-BNHmhFI4i_EK3vUmuMgZ6wb&index=70
 
I have the following problem: I need to track information. Once we complete a phrase, the representation must resolve to zero. Then another phrase is initiated by adjusting the representation until it is no longer zero, then doing something else to resolve to zero once more.

In the system I had proposed, a determinant of zero shows closure of a phrase; then the vectors are transformed, by curling the space, so they don't have a determinant of zero anymore; then it is resolved by adding other vectors that will make the system resolve to a determinant of zero again... I may just be in over my head :)
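For what it's worth, no "curling" of the space is needed to move a determinant away from zero: replacing or re-choosing one of the vectors already does it. A toy sketch of the closed/open phrase test, assuming a "phrase" is a square matrix of column vectors (the function name and tolerance are hypothetical, not from any library):

```python
import numpy as np

def phrase_closed(vectors, tol=1e-9):
    """A phrase counts as 'closed' when the matrix built from its
    column vectors is singular, i.e. its determinant is (near) zero."""
    M = np.column_stack(vectors)
    return abs(np.linalg.det(M)) < tol

# Closed phrase: the two vectors are linearly dependent, det = 0.
print(phrase_closed([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))   # True
# Open phrase: independent vectors give a nonzero determinant.
print(phrase_closed([np.array([1.0, 0.0]), np.array([0.0, 1.0])]))   # False
```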
 
moyo said:
I have these videos that seem to say there are such concepts?

Those are NOT talking about "vector spaces". They are talking about vector fields on surfaces or in space. They are completely different concepts. "Vector spaces" are dealt with in Linear Algebra; "vector fields" are a topic in Differential Geometry.
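For contrast, a toy illustration of the distinction (the particular field is an arbitrary example): a vector space is a set of vectors closed under the two operations, while a vector field assigns a vector to each point of a space.

```python
import numpy as np

# A vector *space* is a set closed under addition and scalar
# multiplication, e.g. R^2 itself. A vector *field* is a function
# attaching a vector to each point of a space:
def field(p):
    """Toy vector field on R^2: the vector at (x, y) is (-y, x)."""
    x, y = p
    return np.array([-y, x])

print(field(np.array([1.0, 0.0])))  # the vector attached at the point (1, 0)
```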
 
moyo said:
I have the following problem: I need to track information. Once we complete a phrase, the representation must resolve to zero. Then another phrase is initiated by adjusting the representation until it is no longer zero, then doing something else to resolve to zero once more.

In the system I had proposed, a determinant of zero shows closure of a phrase; then the vectors are transformed, by curling the space, so they don't have a determinant of zero anymore; then it is resolved by adding other vectors that will make the system resolve to a determinant of zero again... I may just be in over my head :)

I am trying to frame this problem that I have. Would you suggest that it IS possible or not, using vector fields from differential geometry? I know I have to do the research on my own, but a little direction would be nice.
 
