# L1 and L2 norm minimisation

1. Apr 3, 2013

### pamparana

Hi all,

I have a basic optimisation question. I keep reading that the L2 norm is easier to optimise than the L1 norm. I can see why the L2 norm is easy: it leads to a closed-form solution because it is differentiable everywhere.

For the L1 norm, there is a derivative everywhere except at 0, right? Why is this such a problem for optimisation? I mean, there is a valid gradient everywhere else.

I am really having trouble convincing myself why L1 norm minimisation is so much harder than L2 norm minimisation. The L1 norm is convex and continuous as well, and has only one point where it is not differentiable.

Any explanation would be greatly appreciated!

Thanks,

Luca
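
To make the contrast concrete, here is a small sketch (an illustrative setting I'm assuming, not from the thread): minimising ||Ax - b||_2^2 takes one linear solve via the normal equations, while minimising ||Ax - b||_1 has no closed form, so we fall back on an iterative method such as subgradient descent, using a subgradient where the gradient does not exist.

```python
import numpy as np

# Hypothetical example data (not from the thread)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
b = rng.standard_normal(50)

# L2: ||Ax - b||_2^2 is smooth, so setting the gradient to zero gives the
# normal equations A^T A x = A^T b -- one linear solve, done.
x_l2 = np.linalg.solve(A.T @ A, A.T @ b)

# L1: ||Ax - b||_1 is not differentiable wherever a residual is exactly 0,
# so there is no closed form. Subgradient descent: use A^T sign(Ax - b),
# which is a valid subgradient everywhere (sign(0) taken as 0).
x_l1 = np.zeros(3)
for k in range(1, 5001):
    g = A.T @ np.sign(A @ x_l1 - b)
    x_l1 -= (0.01 / np.sqrt(k)) * g   # diminishing step size, needed for convergence
```

Part of the practical answer to the question: the issue is not that a gradient is missing at one point, but that the L1 objective is non-smooth *at the minimiser itself* (optimal solutions sit at kinks), so gradient-style methods can't use a fixed step size and converge much more slowly than Newton-type methods do on the smooth L2 objective.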

2. Apr 3, 2013

### lavinia

If the problem is a linear program, then minimising the L1 norm is a single linear program, while minimising the L2 norm requires a sequence of linear programs that traces out the efficient frontier.
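
A sketch of the first half of that point, that L1 minimisation can be posed as a single linear program (assuming the problem min ||Ax - b||_1, which is my illustration, not lavinia's): introduce auxiliary variables t_i >= |(Ax - b)_i| and minimise sum t_i subject to -t <= Ax - b <= t.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical example data
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)
m, n = A.shape

# Decision vector is [x; t]. Objective: minimise sum of t (x has zero cost).
c = np.concatenate([np.zeros(n), np.ones(m)])

# Constraints:  Ax - b <= t   and   -(Ax - b) <= t,
# stacked as A_ub @ [x; t] <= b_ub.
A_ub = np.block([[A, -np.eye(m)],
                 [-A, -np.eye(m)]])
b_ub = np.concatenate([b, -b])

# x is free; t >= 0 (linprog's default bounds are (0, None), so x must be freed)
bounds = [(None, None)] * n + [(0, None)] * m

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x_l1 = res.x[:n]   # res.fun equals the minimised ||A x_l1 - b||_1
```

So despite the missing derivative at 0, the L1 problem is still very tractable; it just needs LP machinery rather than one linear solve.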