
Several unknowns but only one equation

  1. Apr 24, 2009 #1
    I have a query please, if anybody can shed some light, thanks:

    So from an early age we get this idea of needing one equation for each unknown variable whose unique value we want to discover. Three unknowns? Well, you need three nonsingular equations: it's a kind of rule of thumb, I guess.

    However, I have noticed that in some areas, notably perturbation expansions, one arrives at a single equation and can actually discover not one but several variables from that one equation. Smells of a free lunch, eh? Of course, it can't happen just like that. In perturbation theory, a very common step is to expand in a Taylor series, rearrange into coefficients of rising powers of the key variable (say x), and then say (drums rolling ...) that if the expression equates to zero, then each coefficient (with its own unknowns) also equates to zero.

    This enables us to pull out three, four, even more equations from the original expansion. Golly.

    I've been over a few textbooks on this .. and they seem to treat it as a normal course of deduction. No flicker of the eyelids!

    I admit this is a rough description ... I'll try and get more flesh on it later. Initially however, I wanted to post up about it ... to see if anybody recognises what I'm describing.

  3. Apr 24, 2009 #2
    As you said yourself, the rule only applies to a nonsingular system of linear equations, not to arbitrary systems of equations (this is one of the most basic facts in linear algebra). Consider a^2 + b^2 = 0 with a, b unknown real variables. We have two unknowns, one equation, and one unique solution, (0, 0). I don't really see the problem. Some people may state "you need n equations to determine n unknowns", but they either implicitly assume the equations to be nonsingular linear equations, or they are stating something that is often false but true in many simple cases. You know that this only applies in a special case, and you can come up with cases where it doesn't apply.
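    A quick numeric illustration of the contrast (a toy example of my own, not from the thread): a single *linear* equation in two unknowns has infinitely many solutions, while the single nonlinear equation a^2 + b^2 = 0 pins both unknowns down completely.

    ```python
    # Count integer solutions in a small window, just to show the contrast.
    window = range(-10, 11)

    # One linear equation, two unknowns: a whole line of solutions.
    linear = [(a, b) for a in window for b in window if a + b == 0]

    # One nonlinear equation, two unknowns: exactly one solution.
    nonlinear = [(a, b) for a in window for b in window if a * a + b * b == 0]

    print(len(linear))   # 21 -- every (a, -a) in the window satisfies a + b = 0
    print(nonlinear)     # [(0, 0)] -- squares are nonnegative, so both must vanish
    ```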
  4. Apr 24, 2009 #3
    "Nonsingular" is the wrong word here. You mean "independent" equations.
  5. Apr 24, 2009 #4
    Yes, it is certainly true that if a polynomial (or power series) is equal to 0 for all [itex]x[/itex], then every coefficient must be 0. That's not one equation; that is an infinite number of equations, one for each value of [itex]x[/itex].
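    A minimal sketch of this point, with made-up coefficients of my own choosing: if the single identity (A - 2) + (B + 1)x + (C - 5)x^2 = 0 must hold for *every* x, then evaluating it at any three distinct values of x already yields three independent linear equations, enough to fix A, B, and C. Pure Python with exact fractions:

    ```python
    from fractions import Fraction

    def solve3(M, rhs):
        """Solve a 3x3 linear system by Gauss-Jordan elimination with exact fractions."""
        A = [[Fraction(M[i][j]) for j in range(3)] + [Fraction(rhs[i])]
             for i in range(3)]
        for col in range(3):
            # Find a row with a nonzero pivot in this column and swap it up.
            piv = next(r for r in range(col, 3) if A[r][col] != 0)
            A[col], A[piv] = A[piv], A[col]
            A[col] = [v / A[col][col] for v in A[col]]       # normalize pivot row
            for r in range(3):
                if r != col and A[r][col] != 0:              # eliminate elsewhere
                    A[r] = [a - A[r][col] * b for a, b in zip(A[r], A[col])]
        return [A[i][3] for i in range(3)]

    # The single identity (A-2) + (B+1)x + (C-5)x^2 = 0 for all x, rearranged:
    #   A + B*x + C*x^2 = 2 - x + 5*x^2   at every x.
    # Sampling x = 0, 1, 2 turns the one identity into three equations:
    xs = [0, 1, 2]
    M = [[1, x, x * x] for x in xs]          # coefficients of A, B, C
    rhs = [2 - x + 5 * x * x for x in xs]    # right-hand sides at each sample
    A, B, C = solve3(M, rhs)
    print(A, B, C)   # 2 -1 5
    ```

    The same answer falls out of matching coefficients directly (A = 2, B = -1, C = 5), which is exactly the perturbation-theory move the thread describes: one identity in x, several scalar equations.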
