Perhaps you can model the variable as a piecewise Gaussian. Suppose it has mean a and different standard deviations for x > a and x < a, i.e. its density is:
\varphi(x) = \left\{\begin{array}{ll}
A_{1} \exp\left(-\frac{(x - a)^{2}}{2 \sigma^{2}_{1}}\right), & x > a \\
A_{2} \exp\left(-\frac{(x - a)^{2}}{2 \sigma^{2}_{2}}\right), & x < a
\end{array}\right.
You have to adjust A_{1} and A_{2} so that:
E(X) - a = \int_{-\infty}^{\infty}{(x - a)\, \varphi(x) \, dx} = 0 \;\Rightarrow\; A_{1} \int_{0}^{\infty}{t\, e^{-\frac{t^{2}}{2 \sigma^{2}_{1}}} \, dt} = A_{2} \int_{0}^{\infty}{t\, e^{-\frac{t^{2}}{2 \sigma^{2}_{2}}} \, dt} \;\Rightarrow\; A_{1} \, \sigma^{2}_{1} = A_{2} \, \sigma^{2}_{2}
Of course, the probability density must be normalized:
\int_{-\infty}^{\infty}{\varphi(x) \, dx} = 1 \;\Rightarrow\; A_{1} \int^{\infty}_{0}{e^{-\frac{t^{2}}{2\sigma^{2}_{1}}} \, dt} + A_{2} \int^{\infty}_{0}{e^{-\frac{t^{2}}{2\sigma^{2}_{2}}} \, dt} = 1 \;\Rightarrow\; \sqrt{\frac{\pi}{2}} \left(A_{1} \, \sigma_{1} + A_{2} \, \sigma_{2} \right) = 1
These two equations let you express A_{1} and A_{2} in terms of \sigma_{1} and \sigma_{2}. Try to find the variance of the variable.
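If you want to check your algebra, here is a small numerical sketch (the values a = 2, \sigma_{1} = 1, \sigma_{2} = 3 are arbitrary, chosen only for illustration). It solves the two constraints for A_{1}, A_{2} and integrates the density with SciPy, so you can compare the reported variance with whatever closed form you derive:

```python
import numpy as np
from scipy.integrate import quad

# Arbitrary illustrative parameters (not from the original problem).
a, sigma1, sigma2 = 2.0, 1.0, 3.0

# Solve the two constraints for A1, A2:
#   A1*sigma1**2 - A2*sigma2**2 = 0                 (zero first moment about a)
#   sqrt(pi/2)*(A1*sigma1 + A2*sigma2) = 1          (normalization)
M = np.array([[sigma1**2, -sigma2**2],
              [np.sqrt(np.pi / 2) * sigma1, np.sqrt(np.pi / 2) * sigma2]])
A1, A2 = np.linalg.solve(M, np.array([0.0, 1.0]))

def phi(x):
    """Piecewise-Gaussian density: width sigma1 to the right of a, sigma2 to the left."""
    sigma = sigma1 if x > a else sigma2
    A = A1 if x > a else A2
    return A * np.exp(-(x - a) ** 2 / (2 * sigma ** 2))

norm, _ = quad(phi, -np.inf, np.inf)
mean, _ = quad(lambda x: x * phi(x), -np.inf, np.inf)
var, _ = quad(lambda x: (x - a) ** 2 * phi(x), -np.inf, np.inf)
print(norm, mean, var)   # expect norm ~ 1, mean ~ a; var is the number to reproduce
```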
Next, consider the variable:
\varepsilon_{i} = a \, X_{i} + b \, Y_{i} + c, \qquad a^{2} + b^{2} = 1, \qquad i = 1, \ldots, N
If X_{i} and Y_{i} have the above distribution, what are the expectation value and the variance of \varepsilon_{i}?
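A quick Monte Carlo sketch can confirm whatever you derive here. It assumes X_{i} and Y_{i} are independent, samples the piecewise Gaussian by first picking a lobe with its total probability A_{1}\sigma_{1}\sqrt{\pi/2} and then adding a half-normal offset, and uses made-up location/width parameters and coefficients with a^{2} + b^{2} = 1:

```python
import numpy as np

def lobe_mass(sigma1, sigma2):
    """Solve the two constraints for A1, A2 and return P(X > a)."""
    M = np.array([[sigma1**2, -sigma2**2],
                  [np.sqrt(np.pi / 2) * sigma1, np.sqrt(np.pi / 2) * sigma2]])
    A1, _A2 = np.linalg.solve(M, np.array([0.0, 1.0]))
    return A1 * sigma1 * np.sqrt(np.pi / 2)   # integral of the right lobe

def sample(a, sigma1, sigma2, size, rng):
    """Draw from the piecewise Gaussian: pick a lobe, then add a half-normal offset."""
    right = rng.random(size) < lobe_mass(sigma1, sigma2)
    offset = np.abs(rng.normal(0.0, np.where(right, sigma1, sigma2)))
    return a + np.where(right, offset, -offset)

rng = np.random.default_rng(0)
a_coef, b_coef, c_coef = 0.6, 0.8, -1.0         # a_coef**2 + b_coef**2 == 1
X = sample(2.0, 1.0, 3.0, 500_000, rng)         # illustrative parameters only
Y = sample(-1.0, 2.0, 0.5, 500_000, rng)
eps = a_coef * X + b_coef * Y + c_coef

# Compare with your expressions for E[eps] and Var[eps]
# (linearity of expectation; variances add because X and Y are independent here).
print(eps.mean(), eps.var())
```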
Then treat these variables as approximately Normal with the expectation values and variances found above, and apply the maximum likelihood method, which reduces to a least-squares fit, to estimate the parameters of the general linear dependence:
a \, x + b \, y + c = 0, \qquad a^{2} + b^{2} = 1
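For concreteness, here is a minimal sketch of that least-squares step, assuming all \varepsilon_{i} end up with the same variance so the weights drop out of the sum. With the constraint a^{2} + b^{2} = 1, minimizing \sum_{i}(a x_{i} + b y_{i} + c)^{2} gives c = -(a\bar{x} + b\bar{y}) and (a, b) as the eigenvector of the scatter matrix of the centered data with the smallest eigenvalue. The test data (a line 0.6x + 0.8y - 1 = 0 plus noise) is made up for illustration:

```python
import numpy as np

def fit_line(x, y):
    """Fit a*x + b*y + c = 0 with a**2 + b**2 = 1 by minimizing sum((a*x + b*y + c)**2).

    At the minimum c = -(a*xbar + b*ybar), and (a, b) is the eigenvector of the
    2x2 scatter matrix of the centered data with the smallest eigenvalue.
    """
    xbar, ybar = x.mean(), y.mean()
    D = np.column_stack([x - xbar, y - ybar])
    _, vecs = np.linalg.eigh(D.T @ D)       # eigenvalues come out in ascending order
    a, b = vecs[:, 0]                       # eigenvector of the smallest eigenvalue
    c = -(a * xbar + b * ybar)
    return a, b, c

# Synthetic data scattered around the (made-up) line 0.6*x + 0.8*y - 1 = 0.
rng = np.random.default_rng(1)
t = rng.uniform(-5.0, 5.0, 500)
x = 0.8 * t + rng.normal(0.0, 0.1, t.size)
y = 1.25 - 0.6 * t + rng.normal(0.0, 0.1, t.size)

a, b, c = fit_line(x, y)
print(a, b, c)   # ~ (0.6, 0.8, -1.0), possibly with an overall sign flip
```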