# Optimization of a matrix with an objective function (for ML)

Jan 9, 2012

Hi.

I need to do maximum likelihood estimation with an objective (likelihood) function L, i.e. minimize it, where the optimization variable is a matrix:

$$min_{K}\,\, L(K)$$

For example:
K is, let's say, of size 3x3 and with initial value of ones. ($$k_{i,j}=1\forall i,j$$)
L is $$L=\Vert grad(K) \Vert$$ or $$L=\Vert K \Vert^{1.1}$$.

I know how to do gradient descent etc., but here I need to minimize the function L by iterating over the matrix K, and I don't really know how to approach it. I'd expect an update of this sort:
$$K\,:=\,K+f(\nabla L(K))$$, but I don't know what f should be.
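For context, here is a minimal sketch of what I have in mind, using plain gradient descent on the $$\Vert K \Vert^{1.1}$$ example with a finite-difference gradient (the step size `alpha` and iteration count are just placeholders I picked, not values I know to be right):

```python
import numpy as np

def L(K):
    # Example objective from above: Frobenius norm raised to the power 1.1
    return np.linalg.norm(K) ** 1.1

def num_grad(f, K, eps=1e-6):
    # Central finite-difference gradient of f with respect to each entry of K
    g = np.zeros_like(K)
    for idx in np.ndindex(K.shape):
        Kp = K.copy(); Kp[idx] += eps
        Km = K.copy(); Km[idx] -= eps
        g[idx] = (f(Kp) - f(Km)) / (2 * eps)
    return g

K = np.ones((3, 3))   # initial value: all ones, as described above
alpha = 0.1           # step size (assumed; would need tuning per problem)
for _ in range(200):
    K = K - alpha * num_grad(L, K)   # update: K := K - alpha * grad L(K)

print(L(K))   # objective value after the iterations
```

So is the right choice simply $$f(\nabla L) = -\alpha\,\nabla L$$ applied entrywise to the matrix, or is there something smarter for matrix-valued variables?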

Appreciate any help.

Last edited: Jan 9, 2012