Are These Two Equations Linearly Independent?

turin
Given a system of two equations and two variables, x and y:

x + ay = c
x + by^2 = d

I believe this system can be solved uniquely (please correct me if I'm wrong). My question concerns independence: would one be correct in saying that these two equations are linearly independent, even though the second equation is not linear?
 
Danger, amateur opinion ahead. Does the set of polynomials in two variables of degree less than or equal to 2 form a vector space? If so, I believe the elements x + ay and x + by^2 are linearly independent (which is pretty easy to check: just consider the equation n(x + ay) + m(x + by^2) = 0). I don't think the "linear" in "linear independence" refers to the kind of linearity (or lack of it) you "have" in a polynomial such as x + by^2 ;)
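
Here is a quick sketch of that check in sympy (my own setup; the variable names and the assumption that a and b are nonzero are mine, not from the post):

```python
# Sketch: check linear independence of x + a*y and x + b*y**2 over the reals
# by requiring n*(x + a*y) + m*(x + b*y**2) to vanish identically in x and y.
from sympy import symbols, Poly, solve

x, y, a, b, n, m = symbols('x y a b n m')

combo = n*(x + a*y) + m*(x + b*y**2)
# Every coefficient of the polynomial in x and y must be zero:
coeffs = Poly(combo, x, y).coeffs()  # [m + n, b*m, a*n]
print(solve(coeffs, [n, m]))         # {m: 0, n: 0} for nonzero a and b
```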
 
The system may have 0, 1, or 2 solutions depending on whether a and b are non-zero, as can be seen by substituting for x in the second equation to get a quadratic in y (unless b is zero), which may have no real solutions. If a and b are both zero, there may be no solutions at all. If there is a solution, then it almost certainly isn't unique.
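
To make that substitution concrete, here is a small sympy sketch (the sample values a = b = c = d = 1 are my choice):

```python
# Sketch: substitute x = c - a*y from the first equation into the second,
# giving the quadratic b*y**2 - a*y + (c - d) = 0; its discriminant
# a**2 - 4*b*(c - d) decides whether there are 0, 1, or 2 real solutions.
from sympy import symbols, solve

x, y = symbols('x y')
a, b, c, d = 1, 1, 1, 1  # sample values (mine)
print(solve([x + a*y - c, x + b*y**2 - d], [x, y]))  # two solutions: (0, 1) and (1, 0)
```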
 
muzza is correct; the linear independence refers to the underlying vector space in which these vectors lie.
 
muzza and matt grime,
Thanks very much for not treating me and my question like a couple of idiots. I do appreciate it. :)

I am a little confused about this underlying vector space idea. Is this the same basic idea as treating functions as vectors so that, for instance, the Legendre polynomials are considered as linearly independent vectors?

Actually, now that I think a little harder about it, I don't think that is quite right. The Legendre polynomials are independent by virtue of order, whereas the two equations in my example system have two independent variables. Should I generalize to:

x + ay = c(x,y)
x + by^2 = d(x,y)

and then discuss whether c(x,y) and d(x,y) are linearly independent?

I apologize: I do realize that my question is lacking some element of precision. I just can't put my finger on it (and I suppose the lack of precision is the question).
 
The two variables in the functions also have different orders.

The vector space you have here is R[x, y, ..., z], the polynomial ring in (arbitrarily many) variables x, y, ..., z with real coefficients. The two elements you want to consider are x + ay and x + by^2. They are linearly dependent if there are real numbers p and q, not both zero, such that p(x + ay) + q(x + by^2) is the zero function, the function that is identically zero for all x and y. You may check this in several ways: by letting x = 0, y = 1 and then x = 1, y = 0, you can find relations amongst p, q, a, and b that must be satisfied, and clearly nothing will satisfy them but p = q = 0, i.e. they are linearly independent.
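
A sketch of that point-evaluation argument in sympy (my setup; note that pinning down p = q = 0 from these two points needs a and b generic, in particular a ≠ b):

```python
# Sketch: evaluate p*(x + a*y) + q*(x + b*y**2) at the two points matt grime
# suggests and solve the resulting relations for p and q.
from sympy import symbols, solve

x, y, a, b, p, q = symbols('x y a b p q')
combo = p*(x + a*y) + q*(x + b*y**2)

rel1 = combo.subs({x: 0, y: 1})     # p*a + q*b = 0
rel2 = combo.subs({x: 1, y: 0})     # p + q = 0
print(solve([rel1, rel2], [p, q]))  # {p: 0, q: 0} for generic a != b
```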
 
Thanks, matt grime. That was a little on the math-heavy side for me, but I think I get the idea. I'll have to explore "the polynomial ring" when I find the time. Thanks again to both of you.
 
Linear independence for two objects is pretty easy: if neither is zero, it just means neither one is a multiple of the other.

What you use as multipliers determines what kind of linear independence you are using.

We usually use real numbers for multipliers, so independence of x + ay and x + by^2 just means neither is a real multiple of the other.

That is obvious, since multiplying by a real number cannot change a y into a y^2.

But the more interesting question you are concerned with is what does this say about the number of solutions?

Given two equations in two variables, the number of simultaneous solutions can be infinite even if they are independent in this sense.

For example, x and xy are independent but share the whole y-axis as common solutions. They have a common factor of x, which explains that fact.
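
A quick sympy check of that example (my sketch):

```python
# Sketch: x and x*y are linearly independent, yet share infinitely many
# common zeros, namely the whole y-axis.
from sympy import symbols, solve

x, y = symbols('x y')
print(solve([x, x*y], [x, y]))  # [(0, y)]: x = 0 with y free, i.e. the y-axis
```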

At least if we use complex numbers instead of reals, we can always say that two polynomials share an infinite number of common zeroes only if they have a common irreducible factor. (Then it seems to follow also for real numbers.)

The answer to the problem of how many common solutions two polynomials in two variables have is called Bezout's theorem: if the two polynomials f, g have no common non-constant factors, then they can have at most deg(f)deg(g) common solutions. Tangential solutions can count as more than one, however, as usual.
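
For instance (a sympy sketch; the circle and parabola are my sample choice, not from the post), two degree-2 curves can meet in up to 2 * 2 = 4 points over the complex numbers:

```python
# Sketch: Bezout's theorem bounds the number of common zeros of f and g
# (with no common factor) by deg(f)*deg(g).
from sympy import symbols, solve

x, y = symbols('x y')
f = x**2 + y**2 - 1  # a circle, degree 2 (sample choice)
g = x**2 - y         # a parabola, degree 2
sols = solve([f, g], [x, y])
print(len(sols))     # 4: the Bezout bound, counted over the complex numbers
```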

There is a way to calculate the multiplicity of a solution (a, b): it is the vector space dimension of the local ring R/(f, g), where R is the ring formed from the polynomial ring C[x,y] by allowing as denominators all polynomials not vanishing at the given point (a, b).
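
As a small illustration of counting with multiplicity (my sketch; here the two curves meet only at the origin, so the global quotient ring already gives the local dimension):

```python
# Sketch: the parabola y = x**2 and its tangent line y = 0 meet only at the
# origin, with multiplicity 2: a Groebner basis of (y - x**2, y) is
# {x**2, y}, so the quotient ring has vector space basis {1, x}, dimension 2.
from sympy import symbols, groebner

x, y = symbols('x y')
print(groebner([y - x**2, y], x, y, order='lex'))  # GroebnerBasis([x**2, y], ...)
```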

I believe the first proof of this theorem was due to Gauss(?), and used Euler's(?) theory of resultants.
 