Several unknowns but only one equation

  • Context: Graduate
  • Thread starter: stabu
  • Tags: Unknowns

SUMMARY

This discussion centers on deriving multiple unknowns from a single equation, particularly in the context of perturbation expansions and Taylor series. Participants note that while the usual rule of thumb says one equation is needed for each unknown, perturbation theory extracts several equations from a single polynomial identity by equating the coefficients of like powers to zero. The conversation also distinguishes between nonsingular and independent equations, and clarifies that a polynomial equal to zero for all values of its variable is really an infinite family of equations, one for each value, not a single equation. Seen this way, the technique is consistent with, rather than a challenge to, the usual counting rule from linear algebra.

PREREQUISITES
  • Understanding of perturbation theory
  • Familiarity with Taylor series expansions
  • Basic knowledge of linear algebra concepts, particularly independent equations
  • Experience with polynomial equations and their properties
NEXT STEPS
  • Study the principles of perturbation theory in detail
  • Learn how to perform Taylor series expansions and their applications
  • Explore the concept of independent versus dependent equations in linear algebra
  • Investigate the implications of polynomial equations equating to zero for all values
USEFUL FOR

Mathematicians, physicists, and students of applied mathematics who are interested in advanced equation solving techniques and the applications of perturbation theory in various fields.

stabu
I have a query please, if anybody can shed some light, thanks:

So from an early age we get this idea of needing one equation for each unknown variable whose unique value we need to discover. Three unknowns? Well, you need three nonsingular equations: it's a kind of rule of thumb, I guess.

However, I have noticed that in some areas, notably perturbation expansions, one arrives at a single equation and can actually discover not one but several variables from the one equation. Smells of a free lunch, eh? Of course, it can't happen just like that. In perturbation theory, a very common step is to expand a Taylor polynomial, rearrange it into coefficients of rising powers of the key variable (say x), and then say (drums rolling ...) that if the expression equates to zero, then each coefficient (with its own unknowns) also equates to zero.

This enables us to pull out three, four, even more equations from the original expansion. Golly.
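
To make the step concrete, here is a minimal worked sketch; the toy equation is my own illustration, not one from the thread. Take

$$x^2 + \epsilon x - 1 = 0, \qquad \epsilon \text{ small},$$

and posit $x = x_0 + \epsilon x_1 + \epsilon^2 x_2 + O(\epsilon^3)$. Substituting and collecting powers of $\epsilon$ gives

$$\epsilon^0:\; x_0^2 - 1 = 0, \qquad \epsilon^1:\; 2x_0 x_1 + x_0 = 0, \qquad \epsilon^2:\; x_1^2 + 2x_0 x_2 + x_1 = 0.$$

Because the left-hand side must vanish for every small $\epsilon$, each coefficient vanishes separately: three equations out of one. Taking the $x_0 = 1$ root gives $x_1 = -\tfrac{1}{2}$ and $x_2 = \tfrac{1}{8}$, which matches the Taylor expansion of the exact root $x = \tfrac{1}{2}\bigl(-\epsilon + \sqrt{\epsilon^2 + 4}\bigr)$.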

I've been over a few textbooks on this, and they seem to treat it as a normal course of deduction. No flicker of the eyelids!
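
For anyone who wants to watch that deduction happen mechanically, here is a short SymPy sketch; it uses the same toy equation as the worked example above and is my own illustration, not something from the thread or a textbook.

Code:
# Coefficient matching in a perturbation expansion: one equation in
# eps becomes three equations, one per retained power of eps.
import sympy as sp

eps = sp.symbols('epsilon', positive=True)
x0, x1, x2 = sp.symbols('x0 x1 x2')

# Ansatz truncated at second order in the small parameter.
x = x0 + eps*x1 + eps**2*x2

# Substitute the ansatz into x^2 + eps*x - 1 = 0 and expand in eps.
expr = sp.expand(x**2 + eps*x - 1)

# Equate the coefficients of eps^0, eps^1, eps^2 to zero (higher
# orders are dropped along with the truncation of the ansatz).
eqs = [sp.Eq(expr.coeff(eps, n), 0) for n in range(3)]
for eq in eqs:
    print(eq)

# Solve order by order; the x0 = 1 branch gives x1 = -1/2, x2 = 1/8.
print(sp.solve(eqs, (x0, x1, x2), dict=True))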

I admit this is a rough description ... I'll try to put more flesh on it later. Initially, however, I wanted to post about it, to see if anybody recognises what I'm describing.

Thanks!
 
As you said yourself, the rule only applies to a nonsingular system of linear equations, not to arbitrary systems of equations (this is one of the most basic facts in linear algebra). Consider a^2 + b^2 = 0 with a, b unknown real variables: two unknowns, one equation, and one unique solution, (0, 0), since the squares of reals are nonnegative and so both must vanish. I don't really see the problem. Some people may state "you need n equations to determine n unknowns", but they either implicitly take "equations" to mean nonsingular linear equations, or they are stating something that is often false, though true in many simple cases. You know that the rule only applies in a special case, and you can come up with cases where it doesn't apply.
 
"Nonsingular" is the wrong word here. You mean "independent" equations.
 
stabu said:
I have a query please, if anybody can shed some light, thanks:

So from an early age we get this idea of needing one equation for each unknown variable whose unique value we need to discover. Three unknowns? Well, you need three nonsingular equations: it's a kind of rule of thumb, I guess.

However, I have noticed that in some areas, notably perturbation expansions, one arrives at a single equation and can actually discover not one but several variables from the one equation. Smells of a free lunch, eh? Of course, it can't happen just like that. In perturbation theory, a very common step is to expand a Taylor polynomial, rearrange it into coefficients of rising powers of the key variable (say x), and then say (drums rolling ...) that if the expression equates to zero, then each coefficient (with its own unknowns) also equates to zero.
Yes, it is certainly true that if a polynomial (or power series) is equal to 0 for all x, then every coefficient must be 0. That's not one equation; that is an infinite number of equations, one for each value of x.
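
To spell out the standard argument (added here for completeness; it is not part of the original reply): if

$$p(x) = \sum_{n=0}^{N} c_n x^n = 0 \quad \text{for all } x,$$

then $p(0) = c_0 = 0$, and differentiating $k$ times before setting $x = 0$ gives $k!\, c_k = 0$, so every $c_k = 0$. Equivalently, any $N+1$ distinct values of $x$ already supply $N+1$ independent linear equations for the $N+1$ coefficients.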

 
