# Does multiplication by an invertible matrix preserve convexity?

1. Jul 18, 2014

### sloppyintuit

I am working with polytopes in $\mathbb{R}^N$ that are defined by half-spaces. So they are defined by a number of inequalities (half-space representation), but can also be represented by their vertices, the points where the bounding hyperplanes intersect (vertex representation). Computing the vertices is expensive, so I prefer to do that only when necessary.
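As an illustrative sketch of the two representations (not part of the original question), SciPy can convert a half-space representation into vertices; the unit square here is just a stand-in example:

```python
import numpy as np
from scipy.spatial import HalfspaceIntersection

# Half-space representation of the unit square:
# each row is [a1, a2, b], encoding the inequality a·x + b <= 0
halfspaces = np.array([
    [-1.0,  0.0,  0.0],   # -x <= 0  (x >= 0)
    [ 0.0, -1.0,  0.0],   # -y <= 0  (y >= 0)
    [ 1.0,  0.0, -1.0],   #  x <= 1
    [ 0.0,  1.0, -1.0],   #  y <= 1
])
interior = np.array([0.5, 0.5])  # a strictly interior point is required

hs = HalfspaceIntersection(halfspaces, interior)
print(hs.intersections)  # vertex representation: the corners of the square
```

This vertex enumeration is exactly the expensive step the question wants to avoid, which is why an argument that works directly on the set is preferable.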

I need to apply a transformation from this space into one of the same dimension. The transformation is just a linear one via matrix multiplication.

The matrix happens to be formed as a product of three non-square matrices. I could provide more detail about this, but in the end, the final matrix used for the transformation (the product) is invertible.

The polytope I start with is convex. Is there a straightforward argument that shows this transformation will preserve that convexity? When I imagine a planar example, it seems like the stretching necessary to create concave dimples or change the angle of a corner point would have to be nonlinear. I also can't imagine a composition of shift, scale and rotation operations that would do this.

I would like to find an argument from first principles or a check to perform on the matrices. Any ideas or solutions would be appreciated.
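For the "check to perform on the matrices" part: one concrete check is that the product has full rank. A minimal sketch with NumPy, where the shapes and matrices are hypothetical stand-ins for the three non-square factors described above:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical shapes: A is N x k, B is k x m, C is m x N,
# so the product T = A @ B @ C is the square N x N map applied to the polytope.
N, k, m = 4, 6, 5
A = rng.standard_normal((N, k))
B = rng.standard_normal((k, m))
C = rng.standard_normal((m, N))

T = A @ B @ C

# T is invertible iff it has full rank N; the condition number
# additionally warns about near-singularity in floating point.
print(np.linalg.matrix_rank(T) == N)
print(np.linalg.cond(T))
```

Note that invertibility requires $k, m \ge N$; if either inner dimension were smaller than $N$, the product could not have rank $N$.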

2. Jul 18, 2014

### micromass

A convex set is by definition a set $A$ such that for each $x_1,...,x_n\in A$ and for each $t_1,...,t_n\in [0,1]$ with $t_1 + ... + t_n = 1$ we have $t_1x_1 + ... + t_nx_n \in A$.

So what you are trying to do is to take a convex set $A$ and a linear operator $L$ and seeing whether $L(A)$ is still convex.

So take $L(x_1),\ldots,L(x_n)\in L(A)$ and $t_1,\ldots,t_n\in [0,1]$ such that $t_1 + \ldots + t_n = 1$. We know by convexity of $A$ that $t_1 x_1 + \ldots + t_nx_n\in A$. Since $L$ is linear, it follows that $t_1L(x_1) + \ldots + t_nL(x_n) = L(t_1x_1 + \ldots + t_nx_n) \in L(A)$.

Thus it is indeed true that a linear map preserves convexity.
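The identity at the heart of the proof, $L(t_1x_1 + \ldots + t_nx_n) = t_1L(x_1) + \ldots + t_nL(x_n)$, is easy to observe numerically. A minimal sketch using a random invertible map and the cube's vertices as the example polytope:

```python
import numpy as np

rng = np.random.default_rng(0)

# Vertices of a convex polytope in R^3 (here: the corners of the unit cube)
V = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)

# A random linear map; regenerate if it happens to be singular
L = rng.standard_normal((3, 3))
assert abs(np.linalg.det(L)) > 1e-9

# Random convex-combination weights: t_i >= 0 with sum t_i = 1
t = rng.random(len(V))
t /= t.sum()

# Transform of the convex combination ...
lhs = L @ (t @ V)
# ... equals the same convex combination of the transformed vertices
rhs = t @ (V @ L.T)

print(np.allclose(lhs, rhs))  # True, by linearity of L
```

Note that invertibility is never needed in the proof: any linear map sends convex sets to convex sets. Invertibility only guarantees that the image is again a full-dimensional polytope rather than a lower-dimensional one.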

3. Jul 18, 2014

### sloppyintuit

This is excellent. Thanks very much, that solves it for me!