Elegant Proof of some summations

AI Thread Summary
The discussion centers on finding elegant proofs for summation formulas, particularly for squares, cubes, and quartics from 1 to n. The original poster expresses frustration with existing proofs and seeks a more aesthetically pleasing approach. Participants share insights, including a proof involving the difference of cubes and a recursive method to derive formulas for sums of powers. They also note that the sum of the first n r'th powers can be expressed as a polynomial of degree r+1, which can be approached through induction or integration techniques. The conversation highlights the challenge of balancing elegance with mathematical rigor in proofs of summation formulas.
vsage

Ever since I was in the 8th grade, sigma notation has been one of my favorite things to study and, more specifically, the things you can sum together. The proof for summing the numbers 1 to n is very simple, and everyone knows the story about Gauss and that particular summation, but I haven't seen a proof I like for the sums of squares, cubes, and quartics from 1 to n (or any powers beyond that). I came up with the equations when I was a lot younger, but I had no concept of a proof and I'm really stumped. So, to get the ball rolling, can anyone point me to an "elegant" proof of the summation formula for squares 1^2 to n^2, or at least put me on the right wavelength of thought? The book I have gives a really dumb one that I don't like at all.
 
There's always a proof by Induction, though I'm not sure many people would be willing to call that elegant.
 
Gokul43201 said:
There's always a proof by Induction, though I'm not sure many people would be willing to call that elegant.

What's not elegant about toppling dominoes? :-p
 
vsage, I was in Grade 8 when I 'discovered' a proof that 1^3 + 2^3 + 3^3 + ... + n^3 = (1 + 2 + 3 + ... + n)^2.

My proof involved drawing and was not, in any way, algebraic. Maybe you would like to give this proof a try. :smile:
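The identity recon mentions is easy to check numerically (this is only a sanity check in Python, not the drawing proof itself):

```python
# Check that 1^3 + 2^3 + ... + n^3 = (1 + 2 + ... + n)^2 for small n.
# This verifies the identity numerically; it is not a proof.
for n in range(1, 51):
    lhs = sum(j**3 for j in range(1, n + 1))
    rhs = sum(range(1, n + 1)) ** 2
    assert lhs == rhs
print("identity holds for n = 1..50")
```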
 
vsage,

Your sum is over, say, j from 1 to n of j^2. Observe that
(j + 1)^3 - j^3 = 3j^2 + 3j + 1
Now sum over j on both sides. When you sum the left side, only (n + 1)^3 - 1 remains (everything else cancels!). The first term on the right is 3 times the desired sum, and the remaining sums on the right side are elementary. You can easily obtain the desired result from there.
 
In general the sum of the first n r'th powers is a polynomial in n of degree r+1, and can in theory be found, but it isn't very elegant at all.
 
Tide's argument generalizes to a recursive way to find all such formulas.

Matt's remark that the formulas have degree r+1 also gives a way to find them: write down a general polynomial of that degree, try r+2 cases, then solve for the coefficients.
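That interpolation idea is short enough to sketch. This is my own illustration (exact rational Lagrange interpolation through r+2 sample points, using Python's `fractions` module), not code from the thread:

```python
# Since 1^r + ... + n^r is a polynomial in n of degree r+1, it is pinned
# down by its values at r+2 points. Interpolate through those points
# exactly with Fractions and you recover the summation formula.
from fractions import Fraction

def power_sum_poly(r):
    """Return the degree-(r+1) polynomial p with p(n) = 1^r + ... + n^r,
    as a callable, built by Lagrange interpolation through r+2 points."""
    xs = list(range(1, r + 3))                       # r+2 sample points
    ys = [sum(j**r for j in range(1, x + 1)) for x in xs]
    def p(n):
        total = Fraction(0)
        for i, xi in enumerate(xs):
            term = Fraction(ys[i])
            for j, xj in enumerate(xs):
                if i != j:
                    term *= Fraction(n - xj, xi - xj)
            total += term
        return total
    return p

p2 = power_sum_poly(2)
print(p2(10))  # 385, i.e. 1^2 + 2^2 + ... + 10^2
```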

It is at least easy to prove by induction that the formula not only has degree r+1 but starts out n^(r+1)/(r+1), which is all you need to integrate all powers of x by taking limits of Riemann sums.
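The Riemann-sum connection can be seen numerically (again just an illustration, not part of the original posts): since 1^r + ... + n^r = n^(r+1)/(r+1) + lower-order terms, the normalized sum tends to the integral of x^r over [0, 1].

```python
# (1/n) * sum_{j=1}^{n} (j/n)^r is a Riemann sum for the integral of x^r
# on [0, 1], so it should approach 1/(r+1) as n grows.
n = 10**5
for r in range(1, 5):
    riemann = sum((j / n) ** r for j in range(1, n + 1)) / n
    print(r, riemann, 1 / (r + 1))  # the two values agree to ~1e-5
```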
 
recon said:
Maybe you would like to give this proof a try. :smile:
Did you really construct the square whose edge length is the sum 1 + 2 + ... + n out of all the smaller cubes? Is that what you are talking about? :smile:
 
Still, I find it a little difficult to see how you got much past n = 3 or 4 without using some algebra.
 
The recursive method was the way listed in the book; I probably should have mentioned that. I also noticed, way back when, that the formula for the sum of j^(p+1) from 1 to n resembled the integral of j^p from 1 to n, but differing by something to the effect of p!/(p-2)! (not exactly). Is there a simple way to give a logical basis for this?
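One way to make the sum-versus-integral relationship vsage is groping toward concrete (my own numerical illustration, not an answer given in the thread): compare S_p(n) = 1^p + ... + n^p with the integral of x^p from 0 to n, which is n^(p+1)/(p+1). The two share the same leading term, so their ratio tends to 1, and the difference is a polynomial of strictly lower degree.

```python
# Compare the power sum S_p(n) with the integral n^(p+1)/(p+1).
# Their ratio approaches 1 as n grows; the gap is lower-order terms.
from fractions import Fraction

n = 1000
for p in range(1, 5):
    s = sum(j**p for j in range(1, n + 1))
    integral = Fraction(n ** (p + 1), p + 1)
    print(p, float(Fraction(s) / integral))  # each ratio is close to 1.0
```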
 
