# Is 1-1+1-1+1.... = (1/2) ?

cmb
TL;DR Summary
Is 1-1+1-1+1... = (1/2) , if not where does this calculation go wrong?

Take
S = 1-1+1-1+1-1...

and therefore
S = 0+1-1+1-1+1...

Line up the first 1 with the 0 and add the two series term by term: 1 + 0 = 1, and the rest cancel out.
S = 1-1+1-1+1-1...
S = 0+1-1+1-1+1...

So 2S = 1
S = 1/2

But that seems wrong: surely S can only be 1 or 0?

Homework Helper
Gold Member
2021 Award
You've proved, using some properties of limits, that IF that infinite series converges, then it must converge to ##1/2##. What you haven't proved is that the series converges in the first place. And, in fact, using the definition of the sum of an infinite series, you can show that it does not converge.

russ_watters, hutchphd and BvU
cmb
You've proved, using some properties of limits, that IF that infinite series converges, then it must converge to ##1/2##. What you haven't proved is that the series converges in the first place. And, in fact, using the definition of the sum of an infinite series, you can show that it does not converge.
Why does it need to converge for the logic of summing those infinite series?

Homework Helper
Gold Member
2021 Award
Why does it need to converge for the logic of summing those infinite series?
It doesn't need to converge for the logic to hold. But, it does mean that your logic has led to a vacuous truth.

russ_watters
cmb
It doesn't need to converge for the logic to hold. But, it does mean that your logic has led to a vacuous truth.
But, why?

Where is the logic flawed?

Homework Helper
Gold Member
2021 Award
Where is the logic flawed?
The logic is not flawed. Logic is deducing one thing from another. This is the basis of proof by contradiction. You assume what you suspect is a falsehood, apply valid logic and reach something you know to be false. Therefore, the original assumption must be false.

In this case, if you already know that the series does not sum to ##1/2##, then the tacit assumption that the series has a limit leads to that falsehood. It is, therefore, the original assumption that is false - not that the logic must be flawed.

BvU
cmb
The logic is not flawed. Logic is deducing one thing from another. This is the basis of proof by contradiction. You assume what you suspect is a falsehood, apply valid logic and reach something you know to be false. Therefore, the original assumption must be false.

In this case, if you already know that the series does not sum to ##1/2##, then the tacit assumption that the series has a limit leads to that falsehood. It is, therefore, the original assumption that is false - not that the logic must be flawed.
In no way did I assume the series has a limit, but that every member of the series can cancel out.

A 'series with a limit' is a conventional phrase, not a piece of scientific logic based in abstraction.

The logic here is that if you line up two infinite series, offset one of them, and cancel each member of one against the corresponding member of the other, then the cancellation gives the total sum of the two series together, with the terms cancelling each other out off to infinity.

It is a principle used all the time, so either it is OK to do that, or it is not OK to do that at all.

My question is: why is the logic flawed for this series and not for others? I can accept as an assertion that a series must converge for this logic to hold, but my question is why? Making an assertion and dismissing anyone who asks for some logic as to why the assertion is correct isn't good science.

weirdoguy and PeroK
Staff Emeritus
Gold Member
2021 Award
Basically the point is that as soon as you write "let S = ..." you have made a mistake. The thing on the right-hand side doesn't equal anything; it's a string of symbols that you did random manipulations with.

In theory it's no different than saying something like sin(x)/(n) = six by cancelling the numerator and denominator, other than the mistake being a lot less obvious.

A large part of calculus/analysis is taking infinite sums of numbers, and deciding if they actually end up equaling a number or not. There are a lot of things you need to do carefully, even if a sum does end up equaling a number, if you rearrange the terms of the sum then you can get a totally different number.
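That last point can be checked numerically. A quick sketch of my own (not from the thread), using the standard example of the alternating harmonic series, which converges to ##\ln 2##; rearranging it as one positive term followed by two negative terms converges to ##\ln(2)/2## instead:

```python
import math

# Alternating harmonic series: 1 - 1/2 + 1/3 - 1/4 + ...  -> ln 2
s = sum((-1) ** (k + 1) / k for k in range(1, 200001))

# Rearrangement: one positive term, then two negative terms:
# 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...  -> ln(2) / 2
t = 0.0
for n in range(1, 100001):
    t += 1 / (2 * n - 1)   # next positive (odd-denominator) term
    t -= 1 / (4 * n - 2)   # next two negative (even-denominator) terms
    t -= 1 / (4 * n)

print(round(s, 4), round(math.log(2), 4))       # 0.6931 0.6931
print(round(t, 4), round(math.log(2) / 2, 4))   # 0.3466 0.3466
```

Same terms, different order, different sum: this is only possible because the series converges conditionally rather than absolutely.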

PeroK
cmb
Basically the point is that as soon as you write "let S = ..." you have made a mistake. The thing on the right-hand side doesn't equal anything; it's a string of symbols that you did random manipulations with.

In theory it's no different than saying something like sin(x)/(n) = six by cancelling the numerator and denominator, other than the mistake being a lot less obvious.

A large part of calculus/analysis is taking infinite sums of numbers, and deciding if they actually end up equaling a number or not. There are a lot of things you need to do carefully, even if a sum does end up equaling a number, if you rearrange the terms of the sum then you can get a totally different number.
Fine that you say that. But my question is still not answered. How can you tell it is wrong here, but not in other examples?

Homework Helper
Gold Member
2021 Award
In no way did I assume the series has a limit,
Yes you did. You assumed it summed to some number ##S##.

Take
S = 1-1+1-1+1-1...
That assumption is false. There is no such ##S##.

Homework Helper
Gold Member
2021 Award
But my question is still not answered.
Sorry I wasted my time.

Gold Member
Formally, a series converges if its sequence of partial sums converges. For this series the partial sums are (1, 0, 1, 0, …), which does not converge.
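Spelled out with a quick sketch of my own (not from the thread): the partial sums of 1 - 1 + 1 - 1 + ... oscillate between 1 and 0 forever, so they never settle on a limit.

```python
# Partial sums S_n = a_1 + ... + a_n of the series 1 - 1 + 1 - 1 + ...
terms = [(-1) ** k for k in range(10)]   # 1, -1, 1, -1, ...
partials = []
total = 0
for a in terms:
    total += a
    partials.append(total)

print(partials)  # [1, 0, 1, 0, 1, 0, 1, 0, 1, 0] -- oscillates, never settles
```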

cmb
Yes you did. You assumed it summed to some number ##S##.

That assumption is false. There is no such ##S##.
No. "S =" means S is 'equivalent to something'.

Could be a set or range or something undefined, or something else.

But the logic of cancelling the terms of the infinite sequence then suggests it is a number. If the answer had come out 'undefined' then indeed the maths logic would have made it clear it was not a number. But the cancellation of the set of all those values did give a number.

Like if I wrote "S=1/0", I did not pose there at all that S was a number, and in fact the RHS is undefined therefore S is undefined. At no point did I make an assumption about the nature of what S was.

So how come this sequence of cancellations can be dismissed, but others are acceptable?

weirdoguy
equivalent to something

And that something has to exist, and your S does not. The sum you wrote does not exist.

but others are acceptable?

They are, if and only if the series converges. That is one of the basic theorems on series. You can manipulate the summands if the series converges. If it does not, you can't.

cmb
And that something has to exist, and your S does not. The sum you wrote does not exist.
and I'm asking what the mathematical explanation for that is?
They are, if and only if the series converges. That is one of the basic theorems on series. You can manipulate the summands if the series converges. If it does not, you can't.
Right, so I just asked above what that theorem is that proves this is only possible with a converging series. I cannot find anything that explicitly proves this proposition. I'm asking for your help to either link me to it or put it forward here.

weirdoguy
and I'm asking what the mathematical explanation for that is?

Why does that series not converge? Because the limit of ##(-1)^n## does not exist. A necessary condition for ##\sum a_n## to converge is that ##\lim a_n=0##.
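That necessary condition (the n-th term test) is easy to see here; a trivial sketch of my own, not from the thread:

```python
# The n-th term test: if the terms a_n of a series do not tend to 0,
# then sum(a_n) cannot converge.  Here a_n = (-1)**n, and |a_n| = 1
# for every n, so a_n never gets anywhere near 0.
terms = [(-1) ** n for n in range(1000)]
assert all(abs(a) == 1 for a in terms)
print("every term has absolute value 1, so the n-th term test rules out convergence")
```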

I cannot find anything that explicitly proves this proposition.

Well, I guess others may help with this one, because I learned that theorem like 12 years ago, and I don't remember the proof.

BvU
Homework Helper
Gold Member
2021 Award
and I'm asking what the mathematical explanation for that is?
We all know what you are asking and we all agree on the answer - which you have been given several times now. It's your refusal to accept that answer that confounds us.

Gold Member
Formally, a series converges if its sequence of partial sums converges. For this series the partial sums are (1, 0, 1, 0, …), which does not converge.
The operations that you apply to get the limit can only be proven to work when the series is convergent in the sense of the above definition.

That being said, what you are doing essentially falls under Cesàro summation:
https://en.wikipedia.org/wiki/Cesàro_summation
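For what it's worth, the Cesàro means (the running averages of the partial sums) of this series really do converge to 1/2, which is the precise sense in which 1/2 can be attached to it. A quick numerical sketch of my own:

```python
# Cesaro mean: C_N = (S_1 + ... + S_N) / N, where S_k are the partial
# sums of 1 - 1 + 1 - 1 + ...  The S_k oscillate (1, 0, 1, 0, ...),
# but their average settles down to 1/2.
N = 10000
partials = []
total = 0
for k in range(N):
    total += (-1) ** k
    partials.append(total)

cesaro = sum(partials) / N
print(cesaro)  # 0.5
```

For even N the average is exactly 1/2, since the partial sums alternate 1, 0, 1, 0, ...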

Gold Member
The sum you wrote does not exist.
and I'm asking what the mathematical explanation for that is?
S can only be 1 or 0
There is nothing that exists in mathematics that is both 1 and 0.

Mentor
Take
S = 1-1+1-1+1-1...

"S =" means S is 'equivalent to something'.
Could be a set or range or something undefined, or something else.
No. In the equation in the first quote, you are saying that S is a number that is equal to the expression on the right side. The expression on the right side is NOT a set, or range, or something undefined.

pbuk
Mentor
We all know what you are asking and we all agree on the answer - which you have been given several times now.
@cmb, as @PeroK notes, the question you asked has been answered multiple times, so I'm closing this thread.

pbuk, BvU and PeroK