Are These Two Equations Linearly Independent?

  • Context: Undergrad
  • Thread starter: turin
  • Tags: Non-linear

Discussion Overview

The discussion revolves around the linear independence of two equations involving polynomials: x + ay = c and x + by² = d. Participants explore the implications of linear independence in the context of vector spaces, polynomial degrees, and the uniqueness of solutions to the system of equations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants propose that the system can be solved uniquely, while others express uncertainty about the uniqueness of solutions based on the values of a and b.
  • One participant questions whether the set of polynomials of degree less than or equal to 2 forms a vector space and suggests that the polynomials x + ay and x + by² are linearly independent.
  • Another participant clarifies that linear independence refers to the underlying vector space, suggesting that the independence of the equations may depend on the context of polynomial functions.
  • There is a discussion about the nature of linear independence, with some arguing that it means neither equation can be expressed as a multiple of the other.
  • Concerns are raised about the precision of the original question, with a participant considering whether to generalize the equations to functions of two variables.
  • One participant introduces Bezout's theorem, discussing the relationship between the degrees of polynomials and the number of common solutions, while noting that independence does not necessarily imply a unique solution.

Areas of Agreement / Disagreement

Participants express differing views on the uniqueness of solutions and the implications of linear independence. There is no consensus on the precise definitions or implications of these concepts, indicating that multiple competing views remain.

Contextual Notes

Participants highlight the complexity of the definitions involved, including the dependence on the context of polynomial functions and the nature of the vector space in which the equations reside. There are unresolved questions regarding the assumptions about the coefficients a and b and their impact on the solutions.

turin (Homework Helper)
Given a system of two equations and two variables, x and y:

x + ay = c
x + by² = d

I believe this system can be solved uniquely (please correct me if I'm wrong). My question is that of independence. Would one be correct in the statement that these two equations are linearly independent, even though the second equation is not linear?
 
Danger, amateur opinion ahead. Does the set of polynomials of two variables of degree less than or equal to 2 form a vector space? If so, I believe the elements x + ay and x + by^2 are linearly independent (which is pretty easy to check, just consider the equation n(x + ay) + m(x + by^2) = 0). I don't think the "linear" in "linear independence" refers to the kind of linearity (or lack of) you "have" in a polynomial such as x + by^2 ;)
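The coefficient check suggested above can be sketched mechanically. This is a minimal illustration, not part of the original discussion: it represents each polynomial by its coefficient vector in the (assumed) basis {x, y, y^2}, and uses the fact that two vectors are linearly dependent exactly when every 2x2 minor of the pair vanishes (i.e. one is a scalar multiple of the other).

```python
# Sketch: test linear independence of x + a*y and x + b*y**2 by treating
# them as coefficient vectors in the basis {x, y, y**2}.  Two vectors are
# dependent iff all 2x2 minors vanish (one is a multiple of the other).

def independent(v, w, tol=1e-12):
    """True if coefficient vectors v and w are linearly independent."""
    minors = [v[i] * w[j] - v[j] * w[i]
              for i in range(len(v)) for j in range(i + 1, len(v))]
    return any(abs(m) > tol for m in minors)

a, b = 2.0, 3.0
p1 = [1.0, a, 0.0]   # x + a*y
p2 = [1.0, 0.0, b]   # x + b*y**2
print(independent(p1, p2))                # True for any nonzero a, b
print(independent([1, 0, 0], [2, 0, 0]))  # False: x and 2x are dependent
```

For nonzero a and b the minor a·b is nonzero regardless of the other entries, which matches the n(x + ay) + m(x + by^2) = 0 check in the post above.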
 
The system may have 0, 1, or 2 solutions depending on the values of a and b, as can be seen by substituting for x in the second equation to get a quadratic in y (unless b is zero), which may have no real solutions. If a and b are both zero there may be no solutions at all. If there is a solution, it almost certainly isn't unique.
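The substitution argument above can be sketched concretely. This is an illustrative helper (not from the thread), assuming real coefficients: from x + ay = c we get x = c − ay, and substituting into x + by² = d gives by² − ay + (c − d) = 0, a quadratic in y when b ≠ 0.

```python
import math

def solutions(a, b, c, d, tol=1e-12):
    """Real solutions (x, y) of x + a*y = c, x + b*y**2 = d."""
    if abs(b) > tol:                       # genuine quadratic in y
        disc = a * a - 4 * b * (c - d)
        if disc < -tol:
            return []                      # no real solutions
        r = math.sqrt(max(disc, 0.0))
        ys = {(a + r) / (2 * b), (a - r) / (2 * b)}  # set dedups a repeated root
        return [(c - a * y, y) for y in ys]
    if abs(a) > tol:                       # b = 0: two lines, x = d and x + a*y = c
        return [(d, (c - d) / a)]
    return []  # a = b = 0: x = c and x = d; none unless c == d (then infinitely many)

print(len(solutions(1, 1, 0, 2)))   # 2: the quadratic y**2 - y - 2 has two roots
print(solutions(1, 1, 0, -1))       # []: negative discriminant, no real solution
```

The discriminant a² − 4b(c − d) is exactly what decides between 0, 1, or 2 real solutions, confirming that independence of the two equations does not by itself give uniqueness.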
 
muzza is correct; the linear independence refers to the underlying vector space in which these vectors lie.
 
muzza and matt grime,
Thanks very much for not treating me and my question like a couple of idiots. I do appreciate it. :)

I am a little confused about this underlying vector space idea. Is this the same basic idea as treating functions as vectors so that, for instance, the Legendre polynomials are considered as linearly independent vectors?

Actually, now that I think a little harder about it, I don't think that is quite right. The Legendre polynomials are independent by virtue of order, whereas the two equations in my example system have two independent variables. Should I generalize to:

x + ay = c(x,y)
x + by² = d(x,y)

and then discuss whether c(x,y) and d(x,y) are linearly independent?

I apologize: I do realize that my question is lacking some element of precision. I just can't put my finger on it (and I suppose the lack of precision is the question).
 
The two variables in the functions also have different orders.

The vector space you have here is R[x,y,...,z], the polynomial ring in (arbitrarily many) variables x,y,...,z with real coefficients. The two elements you want to consider are x+ay and x+by^2. They are linearly dependent if there are real numbers p and q, not both zero, such that p(x+ay)+q(x+by^2) is the zero function, the function that is identically zero for all x and y. You may check this in several ways: by letting x=0, y=1 and x=1, y=0 (or, more directly, by comparing the coefficients of x, y and y^2) you can find relations amongst p, q, a and b that must be satisfied, and, for nonzero a and b, nothing will satisfy them but p=q=0, i.e. they are linearly independent.
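One small caveat worth illustrating (my addition, not from the posts): the two test points (x, y) = (1, 0) and (0, 1) give the relations p + q = 0 and pa + qb = 0, which do admit a nontrivial solution when a = b, even though the polynomials are still independent there. A third evaluation point (or a full coefficient comparison) closes the gap.

```python
def combo(p, q, a, b, x, y):
    """Evaluate p*(x + a*y) + q*(x + b*y**2) at the point (x, y)."""
    return p * (x + a * y) + q * (x + b * y * y)

# With a == b == 1, (p, q) = (1, -1) satisfies both test-point relations...
a = b = 1
p, q = 1, -1
assert combo(p, q, a, b, 1, 0) == 0    # p + q = 0
assert combo(p, q, a, b, 0, 1) == 0    # p*a + q*b = 0
# ...yet the combination is not the zero function, so the two polynomials
# are still independent; one more point exposes this:
print(combo(p, q, a, b, 0, 2))   # 2 - 4 = -2, nonzero
```

Comparing all three coefficients, (p+q)x + (pa)y + (qb)y², shows directly that p = q = 0 is forced whenever a and b are nonzero.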
 
Thanks, matt grime. That was a little on the math-heavy side for me, but I think I get the idea. I'll have to explore "the polynomial ring" when I find the time. Thanks again to both of you.
 
linear independence for two objects is pretty easy. if neither is zero, it just means neither one is a multiple of the other.

What you use as multipliers determines what kind of linear independence you are using.

We usually use real numbers for multipliers so independence of x+ay and x+by^2, just means neither is a real multiple of the other.

that is obvious since multiplying by a real number cannot change a y into a y^2.

But the more interesting question you are concerned with is what does this say about the number of solutions?

Given two equations in two variables, the number of simultaneous solutions can be infinite even if they are independent in this sense.

For example x and xy are independent but share the whole y-axis as common solutions. These have a common factor of x, explaining that fact.

At least if we use complex numbers instead of reals we can always say that two polynomials share an infinite number of common zeroes only if they have a common irreducible factor. (Then it seems to follow also for real numbers.)

The answer to the problem of how many common solutions two polynomials in two variables can have is given by Bezout's theorem: if the two polynomials f, g have no common non-constant factors, then they can have at most deg(f)deg(g) common solutions. Tangential solutions can count as more than one however, as usual.
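The Bezout bound can be checked against this thread's own system. This is an illustrative sketch (my addition, assuming real coefficients): f = x + ay − c has degree 1 and g = x + by² − d has degree 2, so there are at most deg(f)·deg(g) = 2 common zeros; over the reals we can only ever see that many or fewer.

```python
def count_real_intersections(a, b, c, d):
    """Number of real common zeros of x + a*y - c and x + b*y**2 - d."""
    if b == 0:
        return 1 if a != 0 else 0          # two lines: at most one point
    # Substituting x = c - a*y gives b*y**2 - a*y + (c - d) = 0.
    disc = a * a - 4 * b * (c - d)
    return 0 if disc < 0 else (1 if disc == 0 else 2)

for params in [(1, 1, 0, 2), (1, 1, 0, -1), (2, 1, 1, 1), (0, 1, 3, 3)]:
    n = count_real_intersections(*params)
    assert n <= 1 * 2                      # the Bezout bound deg(f)*deg(g)
    print(params, n)
```

Over the complexes, and counting tangential intersections with multiplicity, the bound deg(f)·deg(g) is achieved exactly when the curves have no common component.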

there is a way to calculate the multiplicity of a solution (a,b), as the vector dimension of the local ring R/(f,g), where R is the ring formed from the polynomial ring C[x,y] by allowing as denominators all polynomials not vanishing at the given point (a,b).

I believe the first proof of this theorem was due to Gauss?, and used Euler's? theory of resultants.
 
