Why the Fundamental Theorem of Algebra & Calculus are Called "Fundamental"

  • Thread starter: trx123
  • Start date
  • Tags: Fundamental
trx123
Can anyone explain why the Fundamental Theorem of Algebra and the Fundamental Theorem of Calculus are called "Fundamental"?
The algebra theorem states that every non-constant single-variable polynomial with complex coefficients has at least one complex root.
The calculus theorem states that indefinite integration can be reversed by differentiation, and that a definite integral of a function can be computed using any one of its infinitely many antiderivatives.
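In symbols, the second statement says: if ##f## is continuous on ##[a,b]## and ##F## is any one of its antiderivatives (any two differ only by a constant, which cancels), then
$$\int_a^b f(x)\,dx = F(b) - F(a).$$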
According to Wikipedia, the Fundamental Theorem of Algebra is not fundamental for modern algebra; its name was given at a time when algebra was basically about solving polynomial equations with real or complex coefficients. In any case, what is fundamental about these theorems? This is a word-usage question more than a math question.
 
Well, I think the Fundamental Theorem of Algebra is not fundamental to modern algebra, since polynomials are more or less an elementary topic. Anyway, elementary algebra is all about solving equations. Linear and quadratic equations are introduced first, but these are only the tip of the iceberg. Polynomials are much more general, and finding solutions to polynomial equations is not always straightforward. There are many techniques, such as the factor theorem, the rational root theorem, synthetic division, the intermediate value theorem, etc., but you don't always know which ones will work. A very natural and important question, then, is whether any polynomial equation placed before us even has a solution, and the Fundamental Theorem of Algebra answers that. So while the Fundamental Theorem of Algebra does not always lend itself to practical use, it is a very general statement that guarantees solutions to a large class of the algebraic equations one encounters in elementary mathematics.
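To see what that guarantee means, take the simplest equation with no real solution:
$$x^2 + 1 = 0.$$
Over the reals there is no root, but over the complex numbers there are exactly two, ##x = \pm i##. The theorem promises this never fails: a non-constant polynomial ##p(z) = a_n z^n + \cdots + a_1 z + a_0## with complex coefficients and ##a_n \neq 0## always has a complex root, and factoring roots out one at a time gives the complete factorization
$$p(z) = a_n (z - r_1)(z - r_2)\cdots(z - r_n).$$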

The Fundamental Theorem of Calculus is extremely practical. The definition of the integral encountered in a calculus course is not always easy to work with. Finding the area under ##x^2## directly from that definition, for instance, takes a bit of algebra and is prone to computational mistakes. Historically it was a relatively nontrivial result (Archimedes first carried out this kind of computation a while back), because there is no simple geometric argument for it. But after Newton and Leibniz discovered the Fundamental Theorem, such integrals became much simpler: to evaluate a definite integral of a continuous function, one only needs a function whose derivative is the integrand, i.e. an antiderivative, and then evaluates it at the endpoints. This, of course, led to the widely used methods of integration, e.g. integration by parts and substitution.
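To make the contrast concrete, here is the area under ##x^2## on ##[0,1]## computed both ways. Straight from the limit-of-Riemann-sums definition,
$$\int_0^1 x^2\,dx = \lim_{n\to\infty}\sum_{k=1}^{n}\left(\frac{k}{n}\right)^2\frac{1}{n} = \lim_{n\to\infty}\frac{n(n+1)(2n+1)}{6n^3} = \frac{1}{3},$$
which already requires knowing the sum ##1^2 + 2^2 + \cdots + n^2 = \tfrac{n(n+1)(2n+1)}{6}##. With the Fundamental Theorem, since ##\tfrac{d}{dx}\left(\tfrac{x^3}{3}\right) = x^2##,
$$\int_0^1 x^2\,dx = \left.\frac{x^3}{3}\right|_0^1 = \frac{1}{3}.$$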
 
Thanks for writing. What I get from your answer is that a theorem may be called "fundamental" if it is a very general statement. Two of Merriam-Webster's definitions of the word "fundamental" are "of central importance" and "dealing with general principles rather than practical application." It is a ubiquitous word in physics. In her book "Warped Passages," Lisa Randall uses the word many times.
 
Fundamental means the theorem is regarded as one of the most important in the field.

In the case of the fundamental theorem of algebra, you have to understand that algebra and abstract ("modern") algebra are two distinct subjects.

You put a lot of detail into your wording of the theorems. To see why they are fundamental, it is helpful to give a rough idea of what they mean:

FT of calc - Integrals and derivatives are inverse operations.
FT of algebra - All non-constant polynomial equations have solutions.
FT of arithmetic - Integers have unique prime factorizations.
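For the arithmetic one, a concrete instance is
$$60 = 2^2 \cdot 3 \cdot 5,$$
and no other product of primes equals 60, apart from reordering the factors.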
 