You can perform a multiple linear regression (least squares) using a design matrix whose columns are two-dimensional monomials.
For example, a third-degree approximation (total degree 3 in x and y) can be generated if you take as basis functions the following:
<br />
[1, x, x^2, x^3, y, yx, yx^2, y^2, y^2x, y^3]<br />
Construct a design matrix whose columns are these basis functions evaluated at your data points. Then regress. You'll obtain the coefficients for each term. Examine the degree of fit.
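A minimal sketch of this step in Python/NumPy (the data arrays `x`, `y`, `z` here are synthetic placeholders; the column order matches the basis list above, so the solution vector holds c_0 through c_9):

```python
import numpy as np

# Placeholder data: sample a known surface with a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
z = 1 + 2*x - x**3 + 0.5*y*x + y**2 + rng.normal(0, 0.01, 200)

# Design matrix with columns [1, x, x^2, x^3, y, yx, yx^2, y^2, y^2x, y^3]
A = np.column_stack([np.ones_like(x), x, x**2, x**3,
                     y, y*x, y*x**2, y**2, y**2*x, y**3])

# Least-squares regression: solve A c ~ z for the coefficient vector c
c, residuals, rank, sv = np.linalg.lstsq(A, z, rcond=None)
```

The residual sum of squares returned by `lstsq` gives a first look at the degree of fit; `rank` should equal 10, the number of basis functions, if the data adequately determine the coefficients.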
Your prediction formula would then be:
<br />
\hat{z}(x,y) = g_0(x) + y \, \left(g_1(x) + y \, \left(g_2(x) + g_3(x) \, y \right) \right)<br />
with
<br />
g_0(x)= c_0+x \, (c_1 + x \, (c_2 + c_3 \, x ))<br />
<br />
g_1(x)= c_4+x \, (c_5 + c_6 \, x)<br />
<br />
g_2(x)= c_7+c_8 \, x<br />
<br />
g_3(x)= c_9<br />
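The nested (Horner) evaluation above can be sketched as follows, assuming a length-10 coefficient array `c` ordered as in the basis list:

```python
def predict(c, x, y):
    # Horner evaluation of each g_k in x, then of the whole polynomial in y
    g0 = c[0] + x*(c[1] + x*(c[2] + c[3]*x))
    g1 = c[4] + x*(c[5] + c[6]*x)
    g2 = c[7] + c[8]*x
    g3 = c[9]
    return g0 + y*(g1 + y*(g2 + g3*y))
```

Because `x` and `y` only appear in elementwise arithmetic, this works unchanged on NumPy arrays as well as scalars.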
A better fit requires a higher-order polynomial. But this has disadvantages, as the number of terms can easily grow large. You also have to weigh the number of data points against the order of fit, and watch for the possibility of numerical ill-conditioning.
However, this approach is not difficult. In fact, it can easily be done in Excel (e.g., with the LINEST function). For example, see the enclosed Excel file, where I took your file and performed this analysis.