Compute transformation matrix in nth dimension

Discussion Overview

The discussion revolves around finding an invertible transformation matrix T that maps a vector of all ones in n-dimensions to a vector with all zeros except for a one in the last position. Participants explore the feasibility of this transformation, particularly in higher dimensions, and consider various approaches to construct such a matrix.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant suggests that finding an invertible rotation matrix T that transforms the vector of all ones to the target vector is computationally intensive in dimensions greater than 5.
  • Another participant questions the possibility of such a transformation, arguing that if T is invertible, then its inverse A must exist, leading to a contradiction regarding the invertibility of A.
  • Some participants note that transformations from R3 to Rn for n > 3 cannot be square and thus cannot be invertible, challenging the notion of a rotation matrix in higher dimensions.
  • Examples of transformation matrices in 2D, 3D, and 4D are provided, illustrating specific cases where the transformation holds.
  • A participant proposes constructing the inverse matrix first, suggesting that it might simplify the process of finding T.
  • Another participant discusses the requirements for the inverse matrix, emphasizing the need for linear independence of columns and proposing a specific construction method using standard basis vectors.

Areas of Agreement / Disagreement

Participants express differing views on the feasibility of the transformation, with some asserting it is impossible under certain conditions while others propose methods to achieve it. The discussion remains unresolved regarding the general applicability of the proposed solutions across different dimensions.

Contextual Notes

There are limitations regarding the assumptions about the dimensions involved, the nature of the transformation (rotation vs. general invertibility), and the computational complexity of finding such matrices. The discussion does not resolve these complexities.

bohdy
I have a vector of all ones in n dimensions, for example (1,1,1) in 3D. I want to find an invertible rotation matrix T that transforms the vector of all ones to the vector (0,0,...,0,1):

Let v be the vector of all ones, and w = (0,0,...,0,1).
Find T such that T.v == w.

In low dimensions it is easy to find such a matrix with the routine below, but it is computationally intensive when n > 5. Is there a better way?

Basic routine: represent the components of T as variables and solve the non-linear problem of finding an invertible matrix T (Det[T] != 0) that satisfies T.v == w.
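A numerical sketch of this basic routine (not the original poster's code, and using random sampling rather than symbolic solving): the constraint T.v == w is linear in the entries of T and can be enforced by construction on the last column, leaving only the determinant check:

```python
import numpy as np

def find_T(n, tries=100, seed=0):
    """Sketch of the basic routine: sample random matrices, enforce the
    linear constraint T @ v == w by fixing the last column (row i must
    sum to w[i]), and keep the first sample with nonzero determinant."""
    rng = np.random.default_rng(seed)
    w = np.zeros(n)
    w[-1] = 1.0
    for _ in range(tries):
        T = rng.standard_normal((n, n))
        T[:, -1] = w - T[:, :-1].sum(axis=1)   # now each row sums to w[i]
        if abs(np.linalg.det(T)) > 1e-9:       # invertibility check
            return T
    raise RuntimeError("no invertible T found")

T = find_T(7)
```

Since a random matrix is almost surely invertible, this usually succeeds on the first try, so the cost is one determinant evaluation rather than a symbolic solve.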

Any better ideas?
 
I'm no expert here, but I don't see how this is possible.

Assume that the inverse of T exists, and that it is A = inverse(T). The original problem can then be rewritten:
Tv = w
v = Aw

However, the only way this can hold is if the columns of A are (0, 0, ..., 0, v) (where 0 here is the zero vector). Thus A is obviously not invertible, and thus T cannot be either.

If I'm wrong, I'm sorry. I'd love to hear where I'm wrong though :)
 
bohdy said:
I have a vector of all ones in n dimensions, for example (1,1,1) in 3D. I want to find an invertible rotation matrix T that transforms the vector of all ones to the vector (0,0,...,0,1):
? Any transformation from R3 to Rn where n > 3 is not square and so is not invertible. Nor can it be a "rotation" matrix.

 
In 2D:
T= {{1, -1}, {1, 0}}
Tinverse = {{0, 1}, {-1, 1}}
T.{1,1}={0,1}

In 3D:
T= {{1, -1, 0}, {-1, 0, 1}, {1, 1, -1}}
Tinverse = {{1, 1, 1}, {0, 1, 1}, {1, 2, 1}}
T.{1,1,1}={0,0,1}

In 4D:
T= {{-1, -2, 0, 3}, {-2, -1, 0, 3}, {-1, 1, -1, 1}, {2, -2, 1, 0}}
Tinverse = {{-(2/3), 1/3, 1, 1}, {-(5/3), 4/3, 1, 1}, {-2, 2, 0, 1}, {-1, 1, 1, 1}}
T.{1,1,1,1}={0,0,0,1}
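As a quick check (a small NumPy sketch, not part of the original post), all three example matrices can be verified numerically: each maps the all-ones vector to (0,...,0,1) and has a nonzero determinant:

```python
import numpy as np

# the 2D, 3D, and 4D example matrices from the post above
examples = {
    2: [[1, -1], [1, 0]],
    3: [[1, -1, 0], [-1, 0, 1], [1, 1, -1]],
    4: [[-1, -2, 0, 3], [-2, -1, 0, 3], [-1, 1, -1, 1], [2, -2, 1, 0]],
}

for n, rows in examples.items():
    T = np.array(rows, dtype=float)
    w = np.zeros(n)
    w[-1] = 1.0
    assert np.allclose(T @ np.ones(n), w)   # T maps (1,...,1) to (0,...,0,1)
    assert abs(np.linalg.det(T)) > 1e-9     # T is invertible
```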


Lord Crc said:
I'm no expert here, but I don't see how this is possible.

Assume that the inverse of T exists, and that it is A = inverse(T). The original problem can then be rewritten:
Tv = w
v = Aw

However, the only way this can hold is if the columns of A are (0, 0, ..., 0, v) (where 0 here is the zero vector). Thus A is obviously not invertible, and thus T cannot be either.

If I'm wrong, I'm sorry. I'd love to hear where I'm wrong though :)
 
HallsofIvy said:
? Any transformation from R3 to Rn where n > 3 is not square and so is not invertible. Nor can it be a "rotation" matrix.

I'm looking to map Rn to Rn. See above post. I used the 3D vector (1,1,1) as an example.

...so in R7 I want to map (1,1,1,1,1,1,1) to (0,0,0,0,0,0,1) etc.
 
Ah, was thinking inside out. Thanks :)

Do you want a proper rotation matrix, or just any invertible matrix that transforms v to w?

Your "basic algorithm" seems to indicate the latter, in which case couldn't you then construct the Tinverse first, which is easy, and then invert that to get T?
 
Hi,
Can you elaborate on how to construct Tinverse easily? Isn't the computational difficulty the same, in that finding T such that T.v=w is as hard as finding a Tinv such that Tinv.w=v?

Any matrix will do. I will then be applying the transformation to other vectors in the space (I don't want two vectors to map onto one, hence the invertibility requirement).

Lord Crc said:
Ah, was thinking inside out. Thanks :)

Do you want a proper rotation matrix, or just any invertible matrix that transforms v to w?

Your "basic algorithm" seems to indicate the latter, in which case couldn't you then construct the Tinverse first, which is easy, and then invert that to get T?
 
Well, again, I might be wrong here but... The requirement on Tinverse is that the last column must be all 1's (in order to transform w to v), and in order to be invertible the columns must be linearly independent. One way to construct such a matrix is to use the standard basis vectors e_i for the first n-1 columns, and then use v as the n-th column:

Tinverse = {{1, 0, 0, 1}, {0, 1, 0, 1}, {0, 0, 1, 1}, {0, 0, 0, 1}}

This matrix can then be inverted and will transform v into w:

T = {{1, 0, 0, -1}, {0, 1, 0, -1}, {0, 0, 1, -1}, {0, 0, 0, 1}}
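This construction generalizes to any n. A minimal NumPy sketch (assuming, per the discussion, that any invertible T will do, not necessarily a rotation):

```python
import numpy as np

def make_T(n):
    """Construct T via its inverse: Tinverse has the standard basis
    vectors e_1, ..., e_{n-1} as its first columns and the all-ones
    vector v as its last column, so Tinverse @ w == v."""
    Tinv = np.eye(n)
    Tinv[:, -1] = 1.0           # last column = v = (1, ..., 1)
    return np.linalg.inv(Tinv)  # T is the identity with last column (-1,...,-1,1)

T = make_T(7)
```

For example, make_T(7) maps (1,1,1,1,1,1,1) to (0,0,0,0,0,0,1), the R7 case mentioned above, and make_T(4) reproduces the 4x4 matrix T given in this post.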
 
Excellent. That's the kind of solution I was hoping for. Many thanks.

Lord Crc said:
Well, again, I might be wrong here but... The requirement on Tinverse is that the last column must be all 1's (in order to transform w to v), and in order to be invertible the columns must be linearly independent. One way to construct such a matrix is to use the standard basis vectors e_i for the first n-1 columns, and then use v as the n-th column:

Tinverse = {{1, 0, 0, 1}, {0, 1, 0, 1}, {0, 0, 1, 1}, {0, 0, 0, 1}}

This matrix can then be inverted and will transform v into w:

T = {{1, 0, 0, -1}, {0, 1, 0, -1}, {0, 0, 1, -1}, {0, 0, 0, 1}}
 
