My apologies if this is in the wrong section... I wasn't quite sure where to put it.

Suppose we have the "Hessian of a Lagrangian" L, which we can write in block form as

L = [H g ; g' 0]

where the Lagrangian is l = f + λh, with f the objective function, h the (single) constraint, H the Hessian of f, and g the gradient of h (you can also think of g as the derivative of l with respect to the variables of f, followed by differentiation with respect to λ). g' is the transpose of g (so it's a row vector), and the 0 is a scalar since there is only a single Lagrange multiplier.

MY QUESTION: Is there a way to cleanly compute the inverse of L? I'm looking for an analytical solution that exploits the block structure.

The motivation for this question is that for the 2-by-2 matrix [a b ; c 0], the inverse is [0 1/c ; 1/b -a/(bc)]. Obviously I could do an eigenvalue-eigenvector decomposition of L, but I was hoping to do this to H instead and then exploit some sort of symmetry.

Ultimately, I want to use the inverse to compute the so-called Newton-Lagrange step (the Newton step for constrained optimization).
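For context, here is a minimal numerical sketch of the setup I have in mind (all names and values — H, g, b, the quadratic objective — are made up for illustration; it solves the bordered system directly rather than forming the inverse):

```python
import numpy as np

# Hypothetical example: minimize f(x) = 1/2 x' H x subject to h(x) = g' x - b = 0.
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])      # Hessian of the objective (symmetric positive definite)
g = np.array([[1.0],
              [2.0]])           # gradient of the single constraint
b = 1.0

# Bordered "Hessian of the Lagrangian": L = [H g ; g' 0]
L = np.block([[H, g],
              [g.T, np.zeros((1, 1))]])

# Newton-Lagrange step from a current iterate (x, lam):
x = np.array([[0.5], [0.5]])
lam = 0.0
rhs = -np.vstack([H @ x + lam * g,    # gradient of the Lagrangian in x
                  g.T @ x - b])       # constraint residual h(x)
step = np.linalg.solve(L, rhs)        # solve, rather than invert L explicitly
dx, dlam = step[:2], step[2, 0]
print(dx.ravel(), dlam)
```

Since the objective here is quadratic and the constraint linear, a single step of this kind lands on the constrained stationary point, which is how I check the assembly.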