# Solution of polynomial equations

Suppose there is a set of complex variables

$$\{x_i,\,i=1 \ldots M;\;\;y_k,\,k=1 \ldots N\}$$

and a polynomial equation

$$p(x_i, y_k) = 0$$

Is there a way to prove or disprove, for such an equation, whether it can be reformulated as

$$f(x_i) = g(y_k)$$

for two functions f and g satisfying

$$\nabla_y f= 0$$
$$\nabla_x g= 0$$
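As a simple illustration (my own example, not part of the original question): the equation

$$x_1^2 + x_2^2 - y_1^3 = 0$$

already has this form, with $f(x_1, x_2) = x_1^2 + x_2^2$ and $g(y_1) = y_1^3$; the question is whether such a split can always be found, or its existence decided.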

Hey tom.stoer.

I'm not familiar with the polynomial constraint you use: can you point me to some more specific definition? (Sorry, I'm only familiar with univariate polynomials.)

You mean the gradient? It's only to stress that f does not depend on y1, y2, ... and g does not depend on x1, x2, ...

No sorry, I mean the p(x_i, y_k) = 0. Is this basically a product of univariate polynomials set equal to 0?

No, it's a general multivariate polynomial in the x_i and y_k; no special construction, no special condition.

The intuition says it's right, but we'll go through the formalities.

So you have p(x_i, y_k) = 0 and f(x_i) - g(y_k) = 0.

Now assume p(x_i, y_k) = f(x_i) - g(y_k) = 0.

You apply each individual operator once (a derivative with respect to one of the y's and one of the x's), and you'll get a cancellation on the RHS, but you will also get a condition on the LHS.
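Written out (in my notation), the cancellation on the RHS is

$$\frac{\partial^2}{\partial x_i\,\partial y_k}\bigl(f(x) - g(y)\bigr) = 0 \quad\text{for all } i, k,$$

since f has no y-dependence and g has no x-dependence; the matching condition on the LHS is then $\partial^2 p / \partial x_i\,\partial y_k = 0$.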

The LHS can be written as an expansion over all monomials in the x's and y's to various integer powers, and you should write p(x_i, y_k) in this form with the coefficients as unknowns.
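As a concrete sketch of that expansion (the bivariate polynomial below is my own example), sympy's `Poly` lists the monomials x^a y^b together with their coefficients:

```python
# List the monomial expansion of a bivariate polynomial:
# p(x, y) = sum over (a, b) of c_ab * x**a * y**b.
import sympy as sp

x, y = sp.symbols('x y')
p = 3*x**2 - 2*x*y + 5*y - 7

poly = sp.Poly(p, x, y)
for (a, b), c in zip(poly.monoms(), poly.coeffs()):
    print(f"coefficient of x^{a} y^{b}: {c}")
```

In the general argument these coefficients are left as unknown variables; the differentiation step then constrains them.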

When you carry out the differentiation on the LHS with respect to both x and y, you end up with a criterion on the coefficients telling you which ones must be zero.
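One concrete necessary condition that falls out (a sketch, assuming we ask for p itself to equal f(x) - g(y)): every mixed partial of p must vanish, i.e. no monomial may contain both an x and a y. A quick sympy check, with example polynomials of my own:

```python
# Necessary condition for p(x, y) = f(x) - g(y): the mixed partial
# d^2 p / (dx dy) must vanish identically, i.e. p has no mixed monomials.
import sympy as sp

x, y = sp.symbols('x y')

def mixed_partial_vanishes(p):
    """True iff d^2 p / (dx dy) is identically zero."""
    return sp.expand(sp.diff(p, x, y)) == 0

print(mixed_partial_vanishes(x**2 - y**3))  # separable form: True
print(mixed_partial_vanishes(x*y + 1))      # mixed monomial x*y: False
```

Note this tests the stronger identity p = f - g; the equation p = 0 might conceivably still be reformulable (e.g. after multiplying p by a suitable nonzero factor) even when p itself fails this check.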

By using both of these differentiability constraints, you can prove the form of f and g, since each derivative cancels one of the two functions, and then you're done.
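When the constraints hold, f and g can even be read off explicitly: a separable p satisfies p(x, y) = p(x, 0) + p(0, y) - p(0, 0), so one may take f(x) = p(x, 0) and g(y) = -(p(0, y) - p(0, 0)). A sketch with a made-up example:

```python
# If p(x, y) = f(x) - g(y), then f and g can be recovered from the
# restrictions of p to the coordinate axes.
import sympy as sp

x, y = sp.symbols('x y')
p = x**3 + 2*x - y**2 + 4

f = p.subs(y, 0)                            # f(x) = p(x, 0), absorbs the constant
g = -(p.subs(x, 0) - p.subs({x: 0, y: 0}))  # g(y), normalized so g(0) = 0
assert sp.expand(f - g - p) == 0            # check: f(x) - g(y) reproduces p
print(f, '|', g)
```

The split is unique only up to a shared constant, which is why one of the two functions has to absorb p(0, 0).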