Linearizing a nonlinear least squares model

1. Dec 7, 2015

vibe3

I have a nonlinear least squares problem with a set of parameters $\bf{g}$, where I need to minimize the function:
$$\chi^2 = \sum_i \left( y_i - M(t_i ; {\bf g}) \right)^2$$
The $t_i$ are some independent parameters associated with the observations $y_i$ and the model function has the form
$$M(t_i ; {\bf g}) = \sqrt{X(t_i; {\bf g})^2 + Y(t_i;{\bf g})^2}$$
The functions $X(t_i;{\bf g})$ and $Y(t_i;{\bf g})$ are linear in the model parameters ${\bf g}$, i.e.,
$$X(t_i;{\bf g}) = \sum_k g_k X_k(t_i)$$
and
$$Y(t_i;{\bf g}) = \sum_k g_k Y_k(t_i)$$
In order to solve the nonlinear least squares problem, I need to construct the matrix $J^T J$ at each iteration, where $J$ is the Jacobian:
$$J_{ij} = - \frac{\partial}{\partial g_j} M(t_i;{\bf g})$$
My question: can anyone see a clever way to optimize the computation of the matrix $J^T J$, given that all of the $X_k(t_i)$ and $Y_k(t_i)$ can be precomputed? I have millions of observations (rows of the Jacobian), so it's extremely slow to compute each row of the Jacobian and accumulate it into the $J^T J$ matrix. I'm hoping that the linearity of the functions $X$ and $Y$ might allow some way to linearize or precompute large portions of the Jacobian matrix, but I don't see an easy way to do this.
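For concreteness, the chain rule gives each Jacobian entry in closed form: $\partial M/\partial g_j = \left( X\,X_j + Y\,Y_j \right)/M$, so every row of $J$ is just a weighted combination of the precomputed rows $X_k(t_i)$ and $Y_k(t_i)$. A minimal NumPy sketch of forming $J^T J$ this way (the names `Xmat`, `Ymat`, `g` and the random data are illustrative, not from the original post):

```python
import numpy as np

# Hypothetical sizes and data: n observations, p parameters.
rng = np.random.default_rng(0)
n, p = 1000, 5
Xmat = rng.standard_normal((n, p))   # Xmat[i, k] = X_k(t_i), precomputed once
Ymat = rng.standard_normal((n, p))   # Ymat[i, k] = Y_k(t_i), precomputed once
g = rng.standard_normal(p)           # current parameter estimate

# X(t_i; g) and Y(t_i; g) are linear in g, so they are matrix-vector products.
X = Xmat @ g
Y = Ymat @ g
M = np.sqrt(X**2 + Y**2)

# Chain rule: dM/dg_j = (X * X_j + Y * Y_j) / M, so each row of J is a
# weighted sum of the corresponding rows of Xmat and Ymat.
J = -(Xmat * (X / M)[:, None] + Ymat * (Y / M)[:, None])

# J^T J as a single matrix product (one BLAS call) instead of a
# row-by-row rank-1 accumulation.
JTJ = J.T @ J
```

Since only the per-row weights $X/M$ and $Y/M$ change between iterations, this reduces each iteration to two scalings of the precomputed matrices plus one `p × n` by `n × p` product, which BLAS handles far faster than an explicit loop over rows.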

2. Dec 12, 2015