Several unknowns but only one equation

  • Context: Graduate
  • Thread starter: stabu
  • Tags: Unknowns

Discussion Overview

The discussion revolves around the relationship between the number of equations and unknown variables, particularly in the context of perturbation theory and polynomial expansions. Participants explore the implications of deriving multiple equations from a single equation through techniques like Taylor series expansion.

Discussion Character

  • Exploratory
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant notes the common rule that one needs as many equations as unknowns, but questions its applicability in cases like perturbation expansions, where several unknowns appear to be determined by a single equation.
  • Another participant points out that the rule applies specifically to nonsingular systems of linear equations and provides an example where one equation can yield a unique solution for multiple unknowns.
  • A third participant corrects the terminology used, suggesting that "independent" is a more accurate term than "nonsingular" in this context.
  • A later reply emphasizes that if a polynomial equals zero for all values of x, this is really an infinite number of equations, one for each value of x, rather than just one equation.

Areas of Agreement / Disagreement

Participants express differing views on the applicability of the rule regarding equations and unknowns. There is no consensus on whether the initial participant's interpretation of deriving multiple equations from a single equation is valid, as some challenge this perspective while others provide clarifications.

Contextual Notes

Participants highlight limitations in the initial understanding of the relationship between equations and unknowns, particularly in distinguishing between types of equations and their implications in various mathematical contexts.

stabu:
I have a query please, if anybody can shed some light, thanks:

So from an early age we get this idea of needing one equation for each unknown variable whose unique value we want to discover. Three unknowns? Well, then you need three nonsingular equations: it's a kind of rule of thumb, I guess.

However, I have noticed that in some areas, notably perturbation expansions, one arrives at a single equation and can actually discover not one but several variables from the one equation. Smells of a free lunch, eh? Of course, it can't happen just like that. In perturbation theory, a very common step is to expand in a Taylor polynomial, rearrange into coefficients of rising powers of the key variable (say x), and then argue (drums rolling ...) that if the expression equals zero for all x, then each coefficient (with its own unknowns) must also equal zero.

This enables us to pull out three, four, even more equations from the original expansion. Golly.
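A toy version of what I mean (my own made-up example, just to fix ideas): suppose the rearranged expansion comes out as

$$(a + b - 3) + (2a - b)\,x + (b - c + 1)\,x^2 = 0 \quad \text{for all } x.$$

Setting each coefficient to zero gives three equations, a + b = 3, 2a = b, and c = b + 1, so a = 1, b = 2, c = 3: three unknowns pinned down from what looks like "one" equation.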

I've been over a few textbooks on this, and they seem to treat it as a normal course of deduction. Not a flicker of the eyelids!

I admit this is a rough description ... I'll try to flesh it out later. Initially, however, I wanted to post about it to see if anybody recognises what I'm describing.

Thanks!
 
As you said yourself, the rule only applies to a nonsingular system of linear equations, not to arbitrary systems of equations (this is one of the most basic facts in linear algebra). Consider a^2 + b^2 = 0 with a, b unknown real variables: two unknowns, one equation, and one unique solution (0, 0). I don't really see the problem. Some people state "you need n equations to determine n unknowns", but they either implicitly take "equations" to mean nonsingular linear equations, or they are stating something that is often false, though true in many simple cases. You already know this only applies in a special case, and you can come up with cases where it doesn't apply.
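To spell out why that solution is unique (a standard argument, not something extra from the thread): squares of real numbers are nonnegative, so

$$a^2 + b^2 = 0,\ a, b \in \mathbb{R} \;\Longrightarrow\; 0 \le a^2 = -b^2 \le 0 \;\Longrightarrow\; a = b = 0.$$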
 
"Nonsingular" is the wrong word here. You mean "independent" equations.
 
stabu said:
In perturbation theory, a very common step is to expand in a Taylor polynomial, rearrange into coefficients of rising powers of the key variable (say x), and then argue (drums rolling ...) that if the expression equals zero for all x, then each coefficient (with its own unknowns) must also equal zero.
Yes, it is certainly true that if a polynomial (or power series) is equal to 0 for all x, then every coefficient must be 0. That's not one equation; it is an infinite number of equations, one for each value of x.
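To make the "one equation for each value of x" point concrete (a small illustration of the idea above, my own example): for a quadratic identity

$$c_0 + c_1 x + c_2 x^2 = 0 \quad \text{for all } x,$$

substituting x = 0 gives c_0 = 0, and then x = 1 and x = -1 give c_1 + c_2 = 0 and -c_1 + c_2 = 0, forcing c_1 = c_2 = 0. Three well-chosen values of x already yield three independent linear equations in the coefficients.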

 
