Summing Weird Series: A Basic Understanding

  • Thread starter: DecayProduct
  • Tags: Series, Sums, Weird
AI Thread Summary
The discussion focuses on understanding how to sum series where the ratios between terms change, particularly through the lens of integration. The user explores the function f(x) = √x and grapples with the concept of finding an antiderivative, ultimately applying the Fundamental Theorem of Calculus to compute the integral from 0 to 1. They successfully determine that the integral equals 2/3 by identifying the antiderivative as (2/3)x^(3/2). Questions arise about the process of finding antiderivatives, highlighting the complexity compared to derivatives. The conversation emphasizes the intuitive nature of integration as summing changes to find function values.
DecayProduct
I have a rudimentary understanding of integration as it applies to finding the area under a curve. I get the idea of adding up the areas of progressively smaller rectangles to approach the area, and that in the limit of infinitely many rectangles the sum equals the area exactly. Right now I'm just playing around with the idea, and I'm curious how to sum up n things when the ratio between successive terms changes.

For example, I've drawn a graph of f(x) = \sqrt{x} between 0 and 1. This isn't like a geometric series where I can find the sum using S_{n}=a_{1}(1-r^{n})/(1-r), because r changes. I have discovered that a_{n} = a_{1}\sqrt{n}. I have played around with the ratios and found some interesting patterns, and I have found a complicated way to sum up two terms, but it is really more work than just doing the sum directly, and there'd be no way to do it when n=\infty.
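Out of curiosity, I also tried just computing the rectangle sums numerically for larger and larger n (a rough Python sketch; the helper name `riemann_sum` is my own label):

```python
import math

# Left-endpoint rectangle sum for f(x) = sqrt(x) on [0, 1]:
# n rectangles, each of width 1/n, with height taken at the left edge.
def riemann_sum(n):
    width = 1.0 / n
    return sum(math.sqrt(i * width) * width for i in range(n))

for n in (10, 1000, 100000):
    print(n, riemann_sum(n))  # the sums creep up toward a fixed value
```

The sums seem to settle down as n grows, which is what made me wonder what they settle down *to*.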

Sorry for such a basic question, but how are things like this summed?
 
The idea of integration as the limit of Riemann sums can be used to determine what functions are integrable and can be used as a guide to setting up integrals in applications. But in fact, for all except the simplest examples, we use the "Fundamental Theorem of Calculus"-
\int_a^b f(x)\,dx = F(b) - F(a)
where F(x) is any function having f(x) as derivative; F is an "antiderivative" of f.

In this particular case, to find
\int_0^1 \sqrt{x}\,dx = \int_0^1 x^{\frac{1}{2}}\,dx
I would note that the derivative of (2/3)x^{3/2} is (2/3)(3/2)x^{3/2 - 1} = x^{1/2}, so I can take f(x) = x^{1/2} and F(x) = (2/3)x^{3/2}. Then
\int_0^1 \sqrt{x}\,dx = (2/3)(1^{3/2} - 0^{3/2}) = 2/3
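A quick numerical sanity check of this (a minimal Python sketch; the names `F` and `h` are just labels here, not from the thread):

```python
import math

# F(x) = (2/3) x^(3/2) should be an antiderivative of sqrt(x):
# its central difference quotient should match sqrt(x) at sample points.
def F(x):
    return (2.0 / 3.0) * x ** 1.5

h = 1e-6
for x in (0.25, 0.49, 0.81):
    slope = (F(x + h) - F(x - h)) / (2 * h)
    print(x, slope, math.sqrt(x))  # the two numbers agree closely

# And by the Fundamental Theorem, F(1) - F(0) is the integral:
print(F(1) - F(0))  # 2/3
```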
 
HallsofIvy said:
In this particular case, to find
\int_0^1\sqrt{x}dx= \int x^{\frac{1}{2}}dx
I would note that the derivative of (2/3)x^{3/2} is (2/3)(3/2)x^{3/2- 1}= x^{1/2}

Thanks for the response. I sort of understand it, except for where the derivative of (2/3)x^{3/2} comes from. I mean, is it just something one has to work out by trial and error until finding an antiderivative that equals \sqrt{x}, or is there some more basic and natural way to arrive at it?
 
It is exactly for this reason that finding a primitive is, in general, a lot harder than finding a derivative. In this case, however, you can just use the power rule for integrating powers of x.

\int x^n\,dx = \frac{x^{n+1}}{n+1} + C,\;\; n \neq -1

In words: raise the power of your integrand by 1, then divide by the new power.
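For instance (a small Python check, not from the original post; `antiderivative` is a hypothetical helper name), differentiating the rule's output numerically should recover the original power:

```python
# Power rule for antiderivatives: integral of x^n dx = x^(n+1)/(n+1) + C
# for n != -1. Differentiating the result should give back x^n.
def antiderivative(n):
    return lambda x: x ** (n + 1) / (n + 1)

h = 1e-6
x = 0.7
for n in (0.5, 2.0, 3.0):
    F = antiderivative(n)
    slope = (F(x + h) - F(x - h)) / (2 * h)  # numerical derivative of F
    print(n, slope, x ** n)  # slope matches x**n in each case
```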
 
DecayProduct said:
Thanks for the response. I sort of understand it, except for where the derivative of (2/3)x^{3/2} comes from. I mean, is it just something one has to work out by trial and error until finding an antiderivative that equals \sqrt{x}, or is there some more basic and natural way to arrive at it?

I guess one way you could think of it is this: the derivative function measures all the small changes between two points, and adding up all of those changes recovers the change in the original function's value. In this case the integrand is exactly such a derivative, and the antiderivative is the function whose values you recover. It's kind of intuitive if you think of it that way, and then calculus just has to make sense.
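To make that concrete with a toy example of my own (f(x) = x^2 is just an illustration, not from the thread): its derivative is 2x, and adding up many small changes 2x·dx across [0, 1] rebuilds f(1) − f(0) = 1.

```python
# Sum the small changes f'(x) * dx of f(x) = x**2 across [0, 1];
# the running total should approach f(1) - f(0) = 1.
n = 100000
dx = 1.0 / n
total = sum(2 * (i * dx) * dx for i in range(n))
print(total)  # close to 1
```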
 