Linear Transformation to Shrink/Expand Along a Given Direction


Discussion Overview

The discussion revolves around the formulation of a matrix representing a linear transformation that either shrinks or expands vectors in a specified direction within R^3. Participants explore the mathematical representation of this transformation, particularly focusing on the properties of the transformation regarding projections and orthogonal components.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • Harald proposes a matrix A for a linear transformation that modifies the projection of a vector x onto a unit vector e by a factor λ while maintaining the sizes of components orthogonal to e.
  • Another participant suggests that the transformation can be derived by examining its effect on basis vectors, leading to a specific matrix form involving λ and the components of e.
  • Harald challenges the initial matrix formulation, noting that it fails to preserve orthogonal components, particularly when testing with specific vectors.
  • A later reply acknowledges the error in the previous matrix and attempts to correct it, proposing a new matrix based on the projections and adjustments for each basis vector.
  • Harald expresses concern about the complexity of the matrix and mentions that his original formulation involved complex numbers when λ < 1, although these cancel in the final matrix representation.
  • Harald also notes an alternative representation of the matrix involving k, which combines the direction e and the factor λ, hinting at the potential for complex numbers in the formulation.

Areas of Agreement / Disagreement

Participants initially disagree on the correct formulation: the first proposed matrix is challenged with a concrete counterexample showing it destroys orthogonal components, after which a corrected matrix is proposed and accepted. The remaining discussion concerns the structure of the matrix and its alternative representation in terms of a combined direction/factor vector.

Contextual Notes

Some limitations include the dependence on the specific definitions of the transformation and the assumptions made about the vector e and the factor λ. The discussion reflects uncertainty regarding the correct matrix representation and the implications of using complex numbers in the context of the transformation.

birulami
Assuming that shrinking/expanding in a given direction is a linear transformation in [itex]R^3[/itex], what would be the matrix to perform it?

To be more precise, given a vector

[tex]e=\left(\begin{array}{c}e_1\\e_2\\e_3\end{array}\right)[/tex]

with a length of 1, i.e. [tex]||e||=1[/tex], and a factor [itex]\lambda[/itex], I am looking for a matrix [itex]A[/itex] such that for every vector x the vector [itex]y=A\cdot[/itex]x has a projection onto e that is [itex]\lambda[/itex] times the projection of x, while all components orthogonal to e are kept unchanged.

I came up with a matrix [itex]A[/itex] that contains squares and products of the [itex]e_i[/itex] and, worse, would contain complex numbers for [itex]\lambda<1[/itex]. I expected something simpler. Any ideas?
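For concreteness, the requirement on [itex]A[/itex] can be phrased as a testable property: [itex]e\cdot(Ax)=\lambda(e\cdot x)[/itex], while the component of x orthogonal to e is unchanged. A minimal Python sketch of such a check (the helper names are made up for illustration):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mat_vec(A, x):
    return [dot(row, x) for row in A]

def satisfies_requirement(A, e, lam, x, tol=1e-12):
    """Check that A scales the projection of x onto the unit
    vector e by lam and leaves the orthogonal component alone."""
    y = mat_vec(A, x)
    # The e-component of y must be lam times the e-component of x.
    if abs(dot(e, y) - lam * dot(e, x)) > tol:
        return False
    # The components orthogonal to e must be unchanged.
    x_orth = [xi - dot(e, x) * ei for xi, ei in zip(x, e)]
    y_orth = [yi - dot(e, y) * ei for yi, ei in zip(y, e)]
    return all(abs(a - b) <= tol for a, b in zip(x_orth, y_orth))
```

For example, the identity matrix passes this check only for [itex]\lambda=1[/itex].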

Thanks,
Harald.
 
A fairly standard way of writing a linear transformation as a matrix is to see what it does to each of your basis vectors: the coefficients obtained when the image of [itex]\vec{i}[/itex] is written as a linear combination of the basis vectors form the first column, and so on. In this case, applying this linear transformation to [itex]\vec{i}[/itex] gives the projection of [itex]\vec{i}[/itex] on [itex]\vec{e}[/itex] multiplied by [itex]\lambda[/itex].

In particular, since [itex][e_1, e_2, e_3][/itex] has length 1, the projection of [itex]\vec{i}[/itex] on it is [itex][e_1^2, e_1e_2, e_1e_3][/itex], the projection of [itex]\vec{j}[/itex] is [itex][e_1e_2, e_2^2, e_2e_3][/itex], and the projection of [itex]\vec{k}[/itex] is [itex][e_1e_3, e_2e_3, e_3^2][/itex]. The matrix is
[tex]\left[\begin{array}{ccc} \lambda e_1^2 & \lambda e_1e_2 & \lambda e_1e_3 \\ \lambda e_1e_2 & \lambda e_2^2 & \lambda e_2e_3 \\ \lambda e_1e_3 & \lambda e_2e_3 & \lambda e_3^2 \end{array}\right][/tex]

If you have only squares and products, as my formula does, I don't see how you could possibly get complex numbers!
 
Something seems to be missing in your matrix. Try e=(1,0,0): the matrix then has just [itex]\lambda[/itex] in the top left corner. Now apply it to the vector (1,1,1). The result is [itex](\lambda,0,0)[/itex], so the coordinates orthogonal to e are not kept but killed :-(
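This failure is easy to reproduce numerically; a quick Python sketch (illustrative, here with [itex]\lambda=2[/itex]) applying the proposed matrix [itex]\lambda\, e\, e^T[/itex]:

```python
lam = 2.0
e = [1.0, 0.0, 0.0]

# The proposed matrix lam * e * e^T: only the projection, scaled by lam.
A = [[lam * ei * ej for ej in e] for ei in e]

x = [1.0, 1.0, 1.0]
y = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
print(y)  # [2.0, 0.0, 0.0]: the orthogonal components are wiped out
```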

Harald.
 
You're right- I just calculated the projection onto [itex][e_1, e_2, e_3][/itex]. I'll try again. The vector [1, 0, 0] has projection [itex][e_1^2, e_1e_2, e_1e_3][/itex] onto [itex][e_1, e_2, e_3][/itex] and orthogonal component [itex][1-e_1^2, -e_1e_2, -e_1e_3][/itex], so your transformation maps [1, 0, 0] to [itex][(\lambda-1)e_1^2+1, (\lambda-1)e_1e_2, (\lambda-1)e_1e_3][/itex], and similarly for [0, 1, 0] and [0, 0, 1].

Unless I have made another silly error (quite possible) the matrix is:
[tex]\left[\begin{array}{ccc} (\lambda-1)e_1^2+1 & (\lambda-1)e_1e_2 & (\lambda-1)e_1e_3 \\ (\lambda-1)e_1e_2 & (\lambda-1)e_2^2+1 & (\lambda-1)e_2e_3 \\ (\lambda-1)e_1e_3 & (\lambda-1)e_2e_3 & (\lambda-1)e_3^2+1 \end{array}\right][/tex]
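As a numerical cross-check of the original requirement (a Python sketch, illustrative): the map [itex]y = x + (\lambda-1)(e\cdot x)e[/itex] scales the e-component by [itex]\lambda[/itex] and leaves the orthogonal part alone, with real matrix entries even for [itex]\lambda < 1[/itex]:

```python
import math

lam = 0.5                      # a shrink factor, lam < 1
s = 1.0 / math.sqrt(3.0)
e = [s, s, s]                  # unit vector

# A = I + (lam - 1) * e * e^T
A = [[(lam - 1.0) * e[i] * e[j] + (1.0 if i == j else 0.0)
      for j in range(3)] for i in range(3)]

x = [1.0, 2.0, 3.0]
y = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# The e-component is scaled by lam ...
assert abs(dot(e, y) - lam * dot(e, x)) < 1e-10
# ... and the components orthogonal to e are unchanged.
for i in range(3):
    assert abs((y[i] - dot(e, y) * e[i]) - (x[i] - dot(e, x) * e[i])) < 1e-10
```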
 
Great, thanks. I thought I had made a mistake, because I had hoped for something simpler.

The reason why I was talking about complex numbers is that I started from a vector [itex]k[/itex] which, in your notation, would now be [itex]k=\sqrt{\lambda-1}\cdot e[/itex]. This vector formally combines the direction e and the shrink/expand factor [itex]\lambda[/itex]. Obviously the square root will go complex for [itex]\lambda<1[/itex]. But of course the complex numbers disappear again in the matrix itself.

The matrix can be written as [itex]k\cdot k^T+1_{diag}[/itex] --- not that this tells me anything interesting, though :-)
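The cancellation can be seen entry by entry: each entry of [itex]k\cdot k^T[/itex] is [itex](\lambda-1)e_ie_j[/itex], which is real regardless of the sign of [itex]\lambda-1[/itex]. A quick sketch (Python, illustrative):

```python
import cmath

lam = 0.25            # lam < 1, so sqrt(lam - 1) is imaginary
e = [0.6, 0.8, 0.0]   # unit vector

# k = sqrt(lam - 1) * e has purely imaginary entries for lam < 1 ...
k = [cmath.sqrt(lam - 1.0) * ei for ei in e]

# ... but every entry of k k^T equals (lam - 1) * e_i * e_j, a real number.
K = [[k[i] * k[j] for j in range(3)] for i in range(3)]
for i in range(3):
    for j in range(3):
        assert abs(K[i][j].imag) < 1e-12
        assert abs(K[i][j].real - (lam - 1.0) * e[i] * e[j]) < 1e-12
```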

Season's Greetings,
Harald.
 
