# Linear Transformation to Shrink/Expand Along a Given Direction

1. Dec 24, 2007

### birulami

Assuming that shrinking/expanding in a given direction is a linear transformation in $R^3$, what would be the matrix to perform it?

To be more precise, given a vector

$$e=\left(\begin{array}{c}e_1\\e_2\\e_3\end{array}\right)$$

with a length of 1, i.e. $||e||=1$, and a factor $\lambda$, I am looking for a matrix $A$ such that for every vector $x$ the vector $y=A\cdot x$ has a projection on $e$ that is longer than the projection of $x$ by the factor $\lambda$, while all components orthogonal to $e$ are kept unchanged.

I came up with a matrix $A$ that contains squares and products of the $e_i$ and, worse, would contain complex numbers for $\lambda<1$. I expected something simpler. Any ideas?

Thanks,
Harald.

2. Dec 24, 2007

### HallsofIvy

Staff Emeritus
A fairly standard way of writing a linear transformation as a matrix is to see what it does to each of your basis vectors: write the result of applying the transformation to $\vec{i}$ as a linear combination of the basis vectors, and the coefficients form the first column, etc. In this case, applying this linear transformation to $\vec{i}$ gives the projection of $\vec{i}$ on $\vec{e}$ multiplied by $\lambda$.

In particular, since $[e_1, e_2, e_3]$ has length 1, the projection of $\vec{i}$ on it is $[e_1^2, e_1e_2, e_1e_3]$, the projection of $\vec{j}$ is $[e_1e_2, e_2^2, e_2e_3]$, and the projection of $\vec{k}$ is $[e_1e_3, e_2e_3, e_3^2]$. The matrix is
$$\left[\begin{array}{ccc} \lambda e_1^2 & \lambda e_1e_2 & \lambda e_1e_3 \\ \lambda e_1e_2 & \lambda e_2^2 & \lambda e_2e_3 \\ \lambda e_1e_3 & \lambda e_2e_3 & \lambda e_3^2 \end{array}\right]$$

If you have only squares and products, as my formula does, I don't see how you could possibly get complex numbers!

3. Dec 24, 2007

### birulami

Something is missing in your matrix, it seems. Try $e=(1,0,0)$. The matrix will have just $\lambda$ in the top left corner. Now apply it to the vector $(1,1,1)$. The result is $(\lambda,0,0)$, so the coordinates orthogonal to $e$ are not kept but killed :-(
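For what it's worth, the counterexample is easy to check numerically. A quick NumPy sketch (the factor `lam = 3.0` is an arbitrary choice, not from the thread):

```python
import numpy as np

lam = 3.0                      # arbitrary expansion factor
e = np.array([1.0, 0.0, 0.0])  # unit direction from the counterexample

# the proposed matrix: lambda times the outer product e e^T
A = lam * np.outer(e, e)

x = np.array([1.0, 1.0, 1.0])
print(A @ x)                   # [3. 0. 0.] -- the orthogonal components are killed
```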

Harald.

4. Dec 24, 2007

### HallsofIvy

Staff Emeritus
You're right - I just calculated the projection onto $[e_1, e_2, e_3]$. I'll try again. The vector [1, 0, 0] has projection $[e_1^2, e_1e_2, e_1e_3]$ onto $[e_1, e_2, e_3]$ and orthogonal component $[1-e_1^2, -e_1e_2, -e_1e_3]$, so your transformation maps [1, 0, 0] to $[(\lambda-1)e_1^2+1, (\lambda-1)e_1e_2, (\lambda-1)e_1e_3]$, and similarly for [0, 1, 0] and [0, 0, 1].

Unless I have made another silly error (quite possible), the matrix is:
$$\left[\begin{array}{ccc} (\lambda-1)e_1^2+1 & (\lambda-1)e_1e_2 & (\lambda-1)e_1e_3 \\ (\lambda-1)e_1e_2 & (\lambda-1)e_2^2+1 & (\lambda-1)e_2e_3 \\ (\lambda-1)e_1e_3 & (\lambda-1)e_2e_3 & (\lambda-1)e_3^2+1 \end{array}\right]$$
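One can verify numerically that the matrix $A = 1_{diag} + (\lambda-1)\,e\,e^T$ (identity plus $(\lambda-1)$ times the outer product) has exactly the required property: the projection on $e$ is scaled by $\lambda$ while the orthogonal component is untouched. A sketch with an arbitrary random unit vector and a shrink factor:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5                      # works for shrinking (lambda < 1) too

e = rng.normal(size=3)
e /= np.linalg.norm(e)         # arbitrary unit direction

# identity plus (lambda - 1) times the outer product e e^T
A = np.eye(3) + (lam - 1.0) * np.outer(e, e)

x = rng.normal(size=3)
y = A @ x

# projection on e is scaled by lambda ...
assert np.isclose(e @ y, lam * (e @ x))
# ... while the component orthogonal to e is unchanged
assert np.allclose(y - (e @ y) * e, x - (e @ x) * e)
```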

5. Dec 25, 2007

### birulami

Great, thanks. I thought I had made a mistake, because I had hoped for something simpler.

The reason why I was talking about complex numbers was that I started from a vector $k$ which, in your notation, would now be $k=\sqrt{\lambda-1}\cdot e$. This vector formally combines the direction $e$ and the shrink/expand factor $\lambda$. Obviously the square root will go complex for $\lambda<1$. But of course the complex numbers disappear again in the matrix itself.

The matrix can be written as $k\cdot k^T+1_{diag}$ --- Not that this tells me anything interesting, though :-)
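A quick check (NumPy sketch, with an arbitrarily chosen unit vector) that the complex numbers indeed drop out of $k\cdot k^T$ for $\lambda<1$:

```python
import numpy as np

lam = 0.25                             # lambda < 1: sqrt(lam - 1) is complex
e = np.array([2.0, -1.0, 2.0]) / 3.0   # an arbitrary unit vector

k = np.sqrt(complex(lam - 1.0)) * e    # k = sqrt(lambda-1) * e, purely imaginary here
kkT = np.outer(k, k)                   # k k^T

# the imaginary parts cancel: k k^T equals the real matrix (lam-1) e e^T
assert np.allclose(kkT.imag, 0.0)
assert np.allclose(kkT.real, (lam - 1.0) * np.outer(e, e))
```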

Seasons Greetings,
Harald.