# Challenge: can you take sqrt(n) using only one operation?

logics
The best known algorithm ('Babylonian') to take the square root of a number requires 3 operations per iteration, fewer than Newton's. I suppose it is also the fastest. Is that so?
Do you know a simpler or faster one?

Can you find a method that requires only 1 operation?

Last edited:

Staff Emeritus
Gold Member
Yes: apply the square root operation.

logics
... only 1 *operation...
square root is the problem to be solved, can it also be the solution?

P.S.: I apologize if I have been ambiguous: *(one of the 4 operations: +, -, x, :). The algorithm should contain 1 op, but it can be iterated a few times, of course!
to simplify discussion and use a pocket calculator, let's choose (n = x²) n with less than 21 digits:
210.3456789², 987654.3012²... 597482²....etc

Last edited:
dodo
Yes; if you already know the solution to be, say, r, then you can find said solution by just one division, n/r.

Last edited:
Staff Emeritus
Gold Member
square root is the problem to be solved, can it also be the solution?

P.S.: I apologize if I have been ambiguous: *(one of the 4 operations: +, -, x, :). The algorithm should contain 1 op, but it can be iterated a few times, of course!
to simplify discussion and use a pocket calculator, let's choose (n = x²) n with less than 21 digits:
210.3456789², 987654.3012²... 597482²....etc

So, we get to use "limit" as an operation too? What about "<" as well?

I'm not just being snarky here -- it is provably impossible to obtain $\sqrt{2}$ using just those four operations along with any integers you want, so "limit" or some other related concept is rather important.

Many algorithms for extracting roots rely on being able to tell whether one number is bigger than another, and so "<" is important.

logics
So, we get to use "limit" as an operation too? What about "<" as well?
-- it is provably impossible to obtain $\sqrt{2}$ using just those four operations....

We are only trying to solve a practical problem: we are trying to apply our contemporary wisdom to see if we can do any better than the Babylonian method, if that really is the best algorithm. (I'll tackle this in a separate post.) Our solution should be more convenient even using pencil and paper. Do you think it is possible?

If n is not a square number ($\sqrt{2}$...$\sqrt{125348}$... etc.), we have to stop sometime, somewhere, maybe after $10^{10}$ digits, but we must stop. I hope you agree. If a simple, convenient method requires '>', '<', ... or other, it is interesting to see it.
I was only responding to your (jocular?) post #2, excluding the obvious sqrt, log, ln, etc. I hope that is clear.
here: they decided to stop after 6 digits: 354.045, and it took them 5 rounds (iterations) = 15 (5*3 [+, :, :]) ops. to reach that result. Can you do any better?
I suggested stopping at 10 places (9 digits plus the point) so that non-specialists (without a computer program), like myself, can follow or take part in the discussion with just a pocket calculator.

If this is not clear, I'll give an example: my calculator says $\sqrt{125348}$ = 354.0451949; the number is obviously truncated, and I cannot know whether it is rounded up or down, nor whether it is correct to stop at 354.0451948.
That is why I suggested starting with a number with a finite number of decimals (like $\sqrt{125347.862025}$), so we know that exactly 354.045 is our goal.

P.S. Moreover, if you start with a finite n you can find its root even when many digits are missing: $\sqrt{623,470.122 xxx xxx x49}$, we can find the root: x = 789.601243

Last edited:
DonAntonio
We are only trying to solve a practical problem: we are trying to apply our contemporary wisdom to see if we can do any better than the Babylonian method, if that really is the best algorithm. (I'll tackle this in a separate post.) Our solution should be more convenient even using pencil and paper. Do you think it is possible?

If n is not a square number ($\sqrt{2}$...$\sqrt{125348}$... etc.), we have to stop sometime, somewhere, maybe after $10^{10}$ digits, but we must stop. I hope you agree. If a simple, convenient method requires '>', '<', ... or other, it is interesting to see it.
I was only responding to your (jocular?) post #2, excluding the obvious sqrt, log, ln, etc. I hope that is clear.
here they decided to stop after 6 digits: 354.045, and it took them 5 rounds (iterations) = 15 (5*3 [+, :, :]) ops. to reach that result. Can you do any better?
I suggested stopping at 10 places (9 digits plus the point) so that non-specialists (without a computer program), like myself, can follow or take part in the discussion with just a pocket calculator.

If this is not clear, I'll give an example: my calculator says $\sqrt{125348}$ = 354.0451949; the number is obviously truncated, and I cannot know whether it is rounded up or down, nor whether it is correct to stop at 354.0451948.
That is why I suggested starting with a number with a finite number of decimals (like $\sqrt{125347.862025}$), so we know that exactly 354.045 is our goal.

P.S. Moreover, if you start with a finite n you can find its root even when many digits are missing: $\sqrt{623,470.122 xxx xxx x49}$, we can find the root: x = 789.601243

Your whole post is pretty confusing, making your nick paradoxical. You talked of calculating the square root of a number using "only one operation"... Did you mean using one operation several times, or did you mean to use one unique operation once? It should be obvious that either possibility has a negative answer: no, it is not possible IN GENERAL to do that with only one operation, whatever "one operation" means to you.

Now, it is possible to evaluate the square root of any number to any wanted degree of precision by elementary operations, namely +, -, *, /, and I don't know if this is the Babylonian method.

DonAntonio

logics
Your whole post is pretty confusing,....I don't know if this is the babylonian method.
If you do not know it, probably that is the reason why you could not understand my post. I gave you a link, scroll up a little:

In the OP (post #1), I noted that the Babylonian algorithm requires 3 operations for each iteration, as surely by now you have found out. If you're referring to the title, I could not possibly explain everything there.

I'm asking if you can find a simpler one (algorithm), one which requires less than 3 ops at each iteration. One op, of course, is the limit.
EDIT: Such an algorithm is likely to be less powerful, to converge more slowly, but it would require less than 15 ops. (in the #Example quoted); I suppose anyone can do better than wiki. I'm sure you are able at least to make a first guess more accurate than x0 = 600.

If you run the algorithm only once, then you take sqrt(n) directly, and you reach your goal with one operation ('ne plus ultra'!).

Besides the OP, you probably missed also post #3:
...The algorithm should contain 1 op, but it can be iterated a few times, of course...
I hope it is clear, if it was not.

P.S. A nick is a nick, what's in a name? that which we call a rose....

Last edited:
DonAntonio
If you do not know it, probably that is the reason why you could not understand my post. I gave you a link, scroll up a little:

In the OP (post #1), I noted that the Babylonian algorithm requires 3 operations for each iteration, as surely by now you have found out. If you're referring to the title, I could not possibly explain everything there.

I'm asking if you can find a simpler one (algorithm), one which requires less than 3 ops at each iteration. One op, of course, is the limit. If you run the algorithm only once, then you take sqrt(n) directly, and you reach your goal with one operation ('ne plus ultra'!).

Besides the OP, you probably missed also post #3:

I hope it is clear, if it was not.

P.S. A nick is a nick, what's in a name? that which we call a rose....

I read only what you wrote in your first post, which is exactly the following:

"The best known algorithm ('Babylonian') to take the square root of a number requires 3 operations, less than Newton's. I suppose it's also the fastest. Is it so? Do you know a simpler or faster one?

Can you find a method that requires only 1 operation?"

No iterations, no links, nothing. That's what confused me and, judging by the first responses at least, apparently also others.

DonAntonio

Gold Member
The algorithm should contain 1 op, but it can be iterated a few times, of course!

No such algorithm exists. Fix the number 2, for example, and try to approximate √2 using only one operation from {+, -, ×, /}. If we choose either + or -, then we can only add or subtract 2 (or multiples of 2) from itself repeatedly, so the result will be $2n$ where n is an integer. If we choose either × or /, then we can only multiply or divide 2 (or powers of 2) by itself repeatedly, so the result will be $2^n$ where n is an integer. But neither $2n$ nor $2^n$ can approximate √2 to more than, say, one or two decimal places.

logics
No such algorithm exists..
Thanks jgens, that gives me a chance to compare methods and check if it is my terminology that generates misunderstanding:

here is the B method; is this procedure an algorithm? (If it cannot be described as an algorithm, I stand corrected and I apologize.)
This method requires 3 operations:

(x + a : x) : 2
(the three operations: the division a : x, the addition, and the division by 2)

As wiki shows, the formula must be iterated 5 times, and that adds up to 15 ops. (to find $\sqrt{125348}$ = 354.045, 6-digit accuracy).

There is another method: N[ewton]. It is equivalent to B, but (because of the way it is usually formulated) it requires 4 operations:

(x * x + a) : 2 * x
(the four operations: the square x * x, the addition, the doubling 2 * x, and the division)

(You can work out how many ops. are necessary to solve the problem, and whether N is better than B.)
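The two formulas above can be sketched in Python (function names are mine, not from the thread); both iterations are algebraically the same fixed point, one written with 3 ops per step and one with 4:

```python
def babylonian_sqrt(a, x0, iterations):
    """Approximate sqrt(a) via x <- (x + a/x) / 2 -- 3 ops per iteration."""
    x = x0
    for _ in range(iterations):
        x = (x + a / x) / 2
    return x

def newton_sqrt(a, x0, iterations):
    """Same fixed point written as x <- (x*x + a) / (2*x) -- 4 ops per iteration."""
    x = x0
    for _ in range(iterations):
        x = (x * x + a) / (2 * x)
    return x
```

Starting from the wiki example's guess x0 = 600, five iterations of either reach 354.045... for a = 125348.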

Now, why should all the objections apply only to a hypothetical method A that requires 2 operations?
If I am missing something and your objections are valid only for my hypothesis, please change n to 125347.862025 or any other n which you deem suitable, and let's change the title:
Can you find the square root of 125347.862025 (x= 354.045) with less than 15 operations?

Thanks for your help, jgens, I hope you'll give us the first concrete contribution: if you were in Alexandria or Miletus, or in the Middle Ages with quill and parchment, how would you find 354.045?

Last edited:
DonAntonio
Thanks jgens, that gives me a chance to compare methods and check if it is my terminology that generates misunderstanding:

here is the B method; is this procedure an algorithm? (If it cannot be described as an algorithm, I stand corrected and I apologize.)
This method requires 3 operations:

(x + a : x) : 2
(the three operations: the division a : x, the addition, and the division by 2)

As wiki shows, the formula must be iterated 5 times, and that adds up to 15 ops. (to find $\sqrt{125348}$ = 354.045, 6-digit accuracy).

There is another method: N[ewton]. It is equivalent to B, but (because of the way it is usually formulated) it requires 4 operations:

(x * x + a) : 2 * x
(the four operations: the square x * x, the addition, the doubling 2 * x, and the division)

(You can work out how many ops. are necessary to solve the problem, and whether N is better than B.)

Now, why should all the objections apply only to a hypothetical method A that requires 2 operations?
If I am missing something and your objections are valid only for my hypothesis, please change n to 125347.862025 or any other n which you deem suitable, and let's change the title:
Can you find the square root of 125347.862025 (x= 354.045) with less than 15 operations?

Thanks for your help, jgens, I hope you'll give us the first concrete contribution: if you were in Alexandria, or in the Middle Ages with quill and parchment, how would you find 354.045?

Using the system my dad taught me, and which he learnt when he was a kid, either from my grandpa or at school:

Subdivide the given number into groups of 2 digits, from right to left, before and after the decimal point:

12´53´47.86´20´25

The above 2-groups will be the ones "going down" after each step is finished.

Begin with 12 at the left: take the number whose square is closest to 12 but less than or equal to it. In this case it is 3, and we write it down aside.

Square it and subtract it from 12: we get 3 below 12, and we draw down the next pair, 53, thus getting 353.

Double 3, getting 6 (of course, this is just the doubling method in some disguise, which makes it, perhaps, easier to carry on by hand).

Find a digit x such that $6x \cdot x$ is as close as possible to 353 but no more than it: this digit is 5, and we write it down aside at the right of the 3; so 65*5 = 325, and we subtract this from 353, getting 28, and now draw down the next pair 47... etc.

As you can see, we need two operations for each digit in the answer: doubling the already-found number and then multiplying, except for the first digit, which only required squaring.

So in this case we need 12 - 1 = 11 operations.

DonAntonio

Pd: Hmmm... perhaps the subtractions must be counted also, and thus I have more operations. A pity...
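The digit-by-digit steps above can be sketched in Python (the function name and the pair-list input are my own choices, not from the post); each answer digit comes from the "double, then find the best digit" search:

```python
def longhand_sqrt(pairs):
    """Schoolbook digit-by-digit square root.

    pairs: the number split into 2-digit groups outward from the decimal
    point, e.g. 12'53'47.86'20'25 -> [12, 53, 47, 86, 20, 25].
    Returns the digits of the root; the caller re-places the decimal point.
    """
    root, remainder = 0, 0
    for pair in pairs:
        remainder = remainder * 100 + pair   # "draw down" the next pair
        d = 9                                # largest d with (20*root + d)*d <= remainder
        while (20 * root + d) * d > remainder:
            d -= 1
        remainder -= (20 * root + d) * d
        root = root * 10 + d
    return root
```

For 12´53´47.86´20´25 this yields the digits 354045, i.e. 354.045, with zero remainder.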

Staff Emeritus
Gold Member
Division by 2 is barely an operation. General division is more expensive than general multiplication, and much more than plain addition.

I checked GMP source code -- for small square roots ($N \in [0, 2^{64})$), they start off with a table lookup (for very small numbers), and then use Newton's algorithm to approximate y ~ 1/√N, which has the advantage of requiring no general divisions:
$$y \leftarrow (1/2) (3y - N y^3)$$
with the very last iteration mixing in an extra multiplication by N so the result is an approximation x ~ √N:
$$x \leftarrow (1/2) \left(3(Ny) - y (Ny)^2\right)$$

GMP is actually computing an integer square root with remainder. Once it has the approximation, it computes x^2, and then increments it along until it gets the exact value of $\lfloor \sqrt{N} \rfloor$ (this doesn't require general multiplication: $(x+1)^2 = x^2 + (2x+1)$), and then obtains the remainder.

I haven't worked out the exact details of what it's doing for larger square roots.
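A division-free sketch of that scheme in Python (illustrative only; the real GMP code works on multi-word integers). The standard Newton step for 1/√N is y ← y(3 − Ny²)/2, which needs only multiplications, a subtraction, and a halving; the final step folds in the multiplication by N:

```python
def sqrt_no_division(N, y0, iterations):
    """Approximate sqrt(N) with no general division.

    y0 must be a rough guess at 1/sqrt(N); convergence needs
    0 < y0 < sqrt(3/N).  Each step is y <- y*(3 - N*y*y)/2, and the
    last step is rewritten so it returns N*y ~ sqrt(N) directly.
    """
    y = y0
    for _ in range(iterations - 1):
        y = y * (3 - N * y * y) / 2
    z = N * y                        # mix in the extra multiplication by N
    return (3 * z - y * z * z) / 2   # equals N * (next y) ~ sqrt(N)
```

Since z = Ny, the return value (3z − yz²)/2 is exactly N·(y(3 − Ny²)/2), i.e. N times the next Newton iterate.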

logics
...Division by 2 is barely an operation. General division is more expensive than general multiplication, and much more than plain addition....
That is true, but it may lead to subjective evaluation: plain addition (n + n) is comparable in cost to division by 2 (n : 2) but, arguably, easier. If we want to be more accurate, we may count the micro-operations.

... and then use Newton's algorithm to....
I'm not just being snarky here -- it is provably impossible to obtain $\sqrt{2}$ using just those four operations ....
Doesn't that apply to Newton's algorithm?

Gold Member
The sum of the first n odd numbers is n²; conversely, when the running sum matches the desired number, the count of odd numbers added is its square root. To find the square root of 2, simply append pairs of zeros for the desired accuracy. I guess this is 2 addition operations and one compare per step.

logics
- it is provably impossible to obtain $\sqrt{2}$ using just those four operations along with any integers you want,...
No such algorithm exists. Fix the number 2 for example and try to approximate the √2 using only one operation from ....
Sorry, jgens, I did not reply directly to your post because I thought I was using the term 'algorithm' in a wrong way. You used Hurkyl's example, so I thought you were referring to his post (which I still do not understand). Probably, as I said, I am missing something; could you tell me what?
The fact that Newton's algorithm (which uses only 3 different ops: +, x, :) is actually used (post #13) proves, I suppose a fortiori, that one can find a method using those four operations.

Yours is sound reasoning, but it is founded on the assumption that any method must use B's (or N's) logic (the weighted mean).
You can't prove, by that, that no such algorithm exists.
I do not believe anyone can prove that such a method is impossible. As you know, proving that something is impossible is a tough job.
(Which operation is suitable seems obvious.)
Can you find the square root of 125347.862025 (x= 354.045) with less than 15 operations?
As I said in a previous edit, any schoolboy can do better than wiki and find the right x0, and that would mean only 4 iterations (that is, 12 ops.).
What is the best 'first-guess' you can make?

Last edited:
phylotree
I don't know if this is flawed, but it seems a pretty good approach to go on.

Given x, you are asked to find √x. You first find its integer bounds: supposing they are a, b, you will have $a \le \sqrt{x} \le b$. You can then use $c=\frac{a+b}{2}$ at each iteration to bisect toward the result.
How many operations are needed depends upon how many decimal places you want to find.
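That bracketing idea can be sketched in Python (names are mine); each pass halves the interval, so roughly 3.3 iterations buy one decimal digit:

```python
def bisect_sqrt(x, iterations):
    """Approximate sqrt(x) by bisecting an integer bracket a <= sqrt(x) <= b."""
    b = 1.0
    while b * b < x:      # find an integer upper bound b
        b += 1.0
    a = b - 1.0           # then a <= sqrt(x) <= b
    for _ in range(iterations):
        c = (a + b) / 2
        if c * c > x:     # one multiply and one compare per pass
            b = c
        else:
            a = c
    return (a + b) / 2
```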

Gold Member
Here is how one takes the square root of 2 by using addition: the square root of 2 00 is 14 sums of the odd numbers, 1+3+5+7+9+11+13+15+17+19+21+23+25+27 = 196;
the square root of 2 00 00 is 141 sums, of 2 00 00 00 is 1414 sums, etc. Not to say it's efficient, but it will work with only addition.
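That addition-only scheme can be sketched as follows (a hypothetical helper, not from the post); scaling n by 100 adds one digit to the root:

```python
def isqrt_by_odds(n):
    """Integer square root via 1 + 3 + 5 + ...: the count of odd
    numbers whose running sum stays <= n is floor(sqrt(n))."""
    total, odd, count = 0, 1, 0
    while total + odd <= n:
        total += odd    # running sum of consecutive odd numbers
        odd += 2
        count += 1
    return count
```

isqrt_by_odds(200) is 14, isqrt_by_odds(20000) is 141, and isqrt_by_odds(2000000) is 1414, matching the post.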

logics
.... Not to say it's efficient, but it will work with only addition.
Conversely, one may start with n (= x²) and use only subtraction. If 215² = 46225:

46225 − 1, − 3, − 5... etc. It would take (x =) 215 operations, using this elementary logic.

But subtraction can be more efficient, as it allows one to improve the basic, simple algorithm. Remember you have the right, following the B (or N) scheme (see the wiki link above), to find a suitable x0, performing micro-operations in your mind. If you guess the first digit and the number of digits (4'62'25 $\rightarrow$ 2 | 0 | 0 |), you may start by subtracting 4 | 00 | 00 |: 46225 − 40000 = 6225, and then continue subtracting (200 x 2 + 1) 401, ... etc.:

(n − 40000 =) 6225 − 401, − 403, − 405... etc. That would take only 15 + 1 operations.

But, of course, it takes a more ingenious method, a more sophisticated logic, to outwit the Newtonian and Babylonian logic; the weighted mean is simple but powerful.
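A sketch of that jump-started refinement in Python (the function and its operation counter are my own illustration): subtract the guessed square once, then strip off consecutive odd numbers starting at 2·x0 + 1:

```python
def sqrt_by_subtraction(n, x0):
    """Integer square root by subtraction, jump-started at the guess x0.

    Subtract x0*x0 once, then strip off the consecutive odd numbers
    2*x + 1 for x = x0, x0 + 1, ...  Returns (root, subtractions used).
    """
    ops = 1
    n -= x0 * x0              # e.g. 46225 - 40000 = 6225 for x0 = 200
    x = x0
    odd = 2 * x + 1           # 401 in the example
    while n >= odd:
        n -= odd
        ops += 1
        x += 1
        odd += 2
    return x, ops
```

sqrt_by_subtraction(46225, 200) returns (215, 16): the 15 + 1 operations claimed above.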

Last edited:
Altrepair
I am no expert here at all, but is there a reason why no one posted this algorithm for square roots: http://www.homeschoolmath.net/teaching/sqr-algorithm-why-works.php ? It appears to use the fewest operations, which is what the OP was asking for. It was taught in elementary school in the old days, before the calculator was around.

Mentor
I was taught this algorithm back when I was in about the 7th grade or so (in the 50s).

DonAntonio
I am no expert here at all, but is there a reason why no one posted this algorithm for square roots: http://www.homeschoolmath.net/teaching/sqr-algorithm-why-works.php ? It appears to use the fewest operations, which is what the OP was asking for. It was taught in elementary school in the old days, before the calculator was around.

This is exactly the same algorithm I posted here. I think it is likely the easiest one to use with pencil and paper.

DonAntonio

Gold Member
I don't know if this is considered "Newton": if a = x² + 2x + 1, then the square root of a is x + 1.
Doing some manipulation, we are left with x = (a − 1)/(x + 2); seed x with something close, or just use a, and remember to add 1 for the result.

For a = 2, 10 iterations give 1.414213584 (calculator: 1.414213562);
13 iterations give the same as the calculator.
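This iteration is easy to check with a short Python sketch (the name is mine): the fixed point of x ← (a − 1)/(x + 2) satisfies x² + 2x + 1 = a, so x + 1 converges to √a:

```python
def sqrt_shifted(a, iterations):
    """Iterate x <- (a - 1)/(x + 2), seeded with x = a, then add 1."""
    x = a
    for _ in range(iterations):
        x = (a - 1) / (x + 2)
    return x + 1
```

For a = 2 the convergence is linear (the error shrinks by roughly the factor 1/(√2 + 1)² ≈ 0.17 per step), which matches 13 iterations reproducing the calculator digits.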

logics
I was taught this algorithm back when I was in about the 7th grade or so (in the 50s).

I, too, was taught the long-division (LD) method way back in '56; using only micro-operations, we had fun taking square/cube roots without pen and paper.

LD is surely better than N; that isn't saying much, since N can be catastrophic (266 ops.!) if you choose the wrong x0. The weak point in LD is that the number of operations (*Nt=o) depends on individual skill (and luck), as it includes trial-and-error (the great advantage is that it requires only 3/4-digit ops).

Mark44, you are the greatest expert on algorithms; could you tell Altrepair whether he can put that algorithm (LD) on a pocket calculator? What do you think of Newton's? Could it be used in a more efficient way, skipping useless iterations?

* The OP appeared confusing to a hasty reader because N[umber of operations] means 4 different things:

a [square] root is found with Nt ops., iterating Ni times an algorithm that requires No ops. using N+ different op-signs: Nt = Ni * No.
(In the OP): if No = 1 $\rightarrow$ (N+ = 1, Nt = Ni); in the title: N+ = 1 $\rightarrow$ No = 1.

EDIT: $\sqrt{354.045^2}$; (N) N+ = 3, Nt = 20 (5*4); (B) N+ = 3, Nt = 15 (5*3); (LD, if Ni = 1 $\rightarrow$ Nt = No) N+ = 2, Nt=o = ??

Last edited:
Mentor
I, too, was taught the long-division (LD) method way back in '56; using only micro-operations, we had fun taking square/cube roots without pen and paper.

LD is surely better than N; that isn't saying much, since N can be catastrophic (266 ops.!) if you choose the wrong x0. The weak point is that the number of operations *Nt=o depends on individual skill (and luck), as it includes trial-and-error.

Mark44, you are the greatest expert on algorithms; could you tell Altrepair whether he can put that algorithm (LD) on a pocket calculator? What do you think of Newton's? Could it be used in a more efficient way, skipping useless iterations?
I wouldn't consider myself to be an expert at algorithms, let alone the "greatest" expert. As far as programming a (programmable) calculator, I'm pretty sure it could be done. If you can write down an algorithm, someone can come along and implement the algorithm in some programming language.

* The OP appeared confusing to a hasty reader because N[umber of operations] means 4 different things:
a [square]root is found with Nt ops, iterating Ni times an algorithm that requires No operations and N+ different op-signs
If No = 1 $\rightarrow$ N+ = 1

If (LD) Ni = 1 $\rightarrow$ Nt = No

logics
As far as programming a (programmable) calculator, I'm pretty sure it could be done... and implement the algorithm in some programming language.
I'm not sure that applies to my old 'programmable' (£10) Sharp EL-509, with no programming language. It has a 12²-place buffer where I can write formulas and ANS. I can put N (B) there [because it's a simple formula], changing x $\rightarrow$ ANS: {(ANS² + a) : 2ANS}, and press Enter 5 times. But is this an algorithm, just because of the feedback through ANS? I can't put LD in my calculator (how do I deal with trial-and-error?); I'd like to learn how to do that without a real algorithm. This problem has been bugging me, and I wonder if anyone can kindly help, as:

I used that term in the OP only because I read it in other threads, but I was afraid it was not appropriate: as far as I know, N (or B) is just an iterative method, and an iterated formula requires no logical decisions such as "if A, then GOTO...; if B...".
The best known algorithm... requires 3 operations, ... Do you know a simpler or faster one? Can you find a method that requires only 1 operation?
1 operation: {k [:] m} can hardly be referred to as an 'algorithm'... (I called it, more modestly, a method.)
What about "<" as well?....Many algorithms ...rely on being able to tell whether one number is bigger than another, and so "<" is important.
...which, as was rightly pointed out, needs decisions like: if p > q, then...; if p < q, then STOP, etc. But in post #13, again, N was referred to as Newton's algorithm; so why can N do without '>', '<' while mine can't?

Lastly, to explain the sense of my question ("could you tell Altrepair..."): LD is surely less expensive than N and B, especially if we count micro-operations, but it is surely more cumbersome than both. My challenge was to find the simplest (and fastest) formula; LD is much more complex than N (B).
'Only one operation' may seem a tall order, but not only is it possible, it is not too difficult if you examine the structure of a square.

Last edited:
Staff Emeritus
Gold Member
Many languages have conditional expressions in addition to conditional statements: e.g. a function satisfying:

$$\textrm{Choose}(P, x, y) = \begin{cases} x & P \\ y & \neg P \end{cases}$$

where P is a truth value. In other languages, conditions return numeric values; e.g. true is 1 and false is 0, but sometimes the opposite. You can implement absolute value, for example, by

$$|x| = x (1 - 2(x<0) )$$

Sometimes, various operations can be used as substitutes for tests. e.g. if you have the sign function available, you can use it in ways similar to what I just described when you just care about orderings:

$$|x| = x \mathop{\textrm{sign}}(x)$$
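Both tricks are easy to sketch in Python, where a comparison already yields the 0/1 truth value described above (function names are mine):

```python
def choose(p, x, y):
    """Choose(P, x, y): returns x when p is true (1), y when false (0)."""
    return y + (x - y) * p

def abs_branchless(x):
    """|x| = x * (1 - 2*(x < 0)), computed with no if-statement."""
    return x * (1 - 2 * (x < 0))
```

In Python, True and False are the integers 1 and 0, so the comparison (x < 0) can take part in arithmetic directly.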

Staff Emeritus
Gold Member
Doesn't that apply to Newton's algorithm?
Yes. But at the time, I had decided to give you the benefit of the doubt and assume you really meant to ask about algorithms for producing approximations to square roots, and just didn't want to say it out loud.

Last edited:
logics
I had decided to ..... assume you really mean to ask about algorithms for producing approximations .
You'll greatly oblige me (and future wievers) , if you help me edit the OP, can you (or Mark44, or ... anyone) do it?

Last edited:
Mentor
Wievers - ?? What does that mean? It's not in my dictionary. Did you mean "viewers"?

Antiphon
It can be done in one step provided you specify the precision in advance.

You use the number as the index into an array containing the root. As long as the precision is specified, each real number is represented by an integer. Most modern computers will perform the indexed register load as an atomic operation (single step).
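That table idea can be sketched in Python (the table size and fixed precision are my own illustrative choices): after precomputation, each query really is a single indexed load.

```python
SCALE = 1000                       # fixed precision: 3 decimal places
# precomputed roots of 0.000, 0.001, ..., 9.999
TABLE = [(i / SCALE) ** 0.5 for i in range(10 * SCALE)]

def sqrt_lookup(x):
    """One-step square root for x in [0, 10): a single table index."""
    return TABLE[int(x * SCALE)]
```

The cost is traded entirely into memory: one entry per representable input at the chosen precision.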

logics
Did you mean "viewers"?
I'm of courteous disposition and act my age; do you expect a reply?

I suggested: (edited OP)
The best known algorithm EDIT: i.e. iterative method ('Babylonian') to take the square root of a number requires 3 operations.....
Do you know a simpler or faster one?
Can you find a method that requires only 1 operation? (No = 1)

EDIT: a [square] root is found with Nt ops., iterating Ni times a formula that requires No ops. using N+ different op-signs: Nt = Ni * No.
Babylonian method, $\sqrt{354.045^2}$ (if x0 = 300.00): N+ = No = 3, Ni = 4, 3 * 4 $\rightarrow$ Nt = 12 operations.
If No = 1 $\rightarrow$ (N+ = 1, Nt = Ni); [in the title: N+ = 1 $\rightarrow$ No = 1, Nt < 12];
Nt < 8 is a 'good' solution; (5 * 1) = 5 operations would be 'brilliant'.
Post #31 proves that new readers can't carefully examine all previous posts. An edit to the OP would probably help. Thanks.

Last edited: