Vodkacannon
Recall that the Collatz conjecture concerns the following process: given any natural number n, divide n by 2 if it is even, and multiply n by 3 and add 1 if it is odd. The conjecture says that if you repeat this process, you will always reach 1.
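The step-counting rule can be sketched as a short function (a minimal sketch, not the author's actual program; the name `collatz_steps` is my own):

```python
def collatz_steps(n):
    """Count how many Collatz steps it takes for n to reach 1."""
    steps = 0
    while n != 1:
        if n % 2 == 0:
            n //= 2       # even: halve
        else:
            n = 3 * n + 1  # odd: triple and add 1
        steps += 1
    return steps

print(collatz_steps(5))  # 5 -> 16 -> 8 -> 4 -> 2 -> 1, so 5 steps
```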

Well, the program I have created does exactly this, and then checks whether the number of steps it takes to reach 1 is equal to the initial value fed into the algorithm.

**It turns out that after checking millions of numbers, the only number that equals its own number of steps to reach 1 is 5.**
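A minimal sketch of such a search (my own reconstruction, not the author's program; the range here is far smaller than the millions the author checked):

```python
def collatz_steps(n):
    """Count how many Collatz steps it takes for n to reach 1."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

# Look for initial values whose step count equals the value itself.
matches = [n for n in range(1, 100000) if collatz_steps(n) == n]
print(matches)  # the post reports 5 as the only such number
```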

I conjecture that this is the only such case among all natural numbers.

I also have an idea why. As n → ∞, the number of steps tends to grow. (You can see this by looking at the program's output.)

But it does not always get larger. Some initial values take more steps than smaller initial values.

Still, the general trend is that the number of steps required increases, and thus the probability that n_steps = n approaches 0%.
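One quick way to see this trend is to average the step counts over successive decades of n (a sketch of my own, assuming the usual step-counting convention; exact averages will vary):

```python
def collatz_steps(n):
    """Count how many Collatz steps it takes for n to reach 1."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

# Average step count per decade of starting values: it grows with n,
# but far more slowly than n itself, which is why steps == n should
# become impossible for large n.
for lo in (10, 100, 1000):
    window = range(lo, lo * 10)
    avg = sum(collatz_steps(n) for n in window) / len(window)
    print(f"{lo}-{lo * 10 - 1}: average {avg:.1f} steps")
```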

**Please give me ideas, what else should I study about the Collatz Conjecture? What should I try to find out? Something more interesting.**