# Factoring algorithm time

1. Sep 20, 2007

### michael879

ok so I thought of this factoring algorithm, and I did some quick analysis on it (attached picture; time is in seconds × 5). It appears to be polynomial. However, when I look at the basic brute-force factoring algorithm, I think that's also polynomial. So before I get excited over nothing: can someone explain to me why the brute-force factoring algorithm shown below is exponential time? What I get is approximately O(n^m), where n is the number and m is the number of factors.

Code (Text):
brute_force(n)
{
    for(i = 2; i <= n/2; i++)
    {
        if(n%i == 0)
            return CONCAT(brute_force(i), brute_force(n/i));
    }
    return n;
}

Basically it searches until it finds a factor i, and then it returns the concatenation of the factor lists of the two factors it found (i and n/i). For prime numbers it just returns the number. The main loop is obviously O(n). And as far as I can tell, the function is called recursively on the order of m times, where m is the number of factors. This gives O(n^m).
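For anyone who wants to play with it, here's a runnable Python sketch of the recursive brute-force factorer described above (names and list representation are mine; the original is pseudocode):

```python
def brute_force(n):
    """Return the list of prime factors of n by naive trial division."""
    for i in range(2, n // 2 + 1):       # search divisors up to n/2 inclusive
        if n % i == 0:
            # split n into i * (n // i) and factor both halves recursively
            return brute_force(i) + brute_force(n // i)
    return [n]                           # no divisor found: n is prime

print(brute_force(360))  # [2, 2, 2, 3, 3, 5]
```

Since the smallest factor is always found first, each recursive call peels off one prime, so the recursion depth is the number of prime factors.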

2. Sep 20, 2007

### michael879

The attachment got messed up; here it is.

#### Attached Files:

• factor.JPG
3. Sep 20, 2007

### CRGreathouse

That program has running time roughly proportional to the largest factor of the number, which in the worst case is the whole number if it's prime. (If you check for primality first that reduces it massively to the square root of the number in the worst case.)

However, much faster algorithms are known. The ideal algorithm, which is probably impossible*, would run in polynomial time. But "polynomial time" in this case means polynomial in the size of the input.

If you have a b-bit number n, you have $2^{b-1}\le n<2^b$. Your algorithm then has run time $\mathcal{O}(2^b)$ (or $\mathcal{O}(\sqrt2\,^b)$ if you check for primes), while there are algorithms that are $\mathcal{O}((1+\epsilon)^b)$ for all $\epsilon>0$ -- that is, they are subexponential.

* This has not been proven.
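A small numeric check (my own illustration, not from the post above) of what "linear in n means exponential in the bit length b" looks like in practice: counting trial-division steps in the worst case (a prime input) roughly doubles each time b grows by one bit.

```python
def trial_steps(n):
    """Number of candidate divisors tried before declaring n prime."""
    steps = 0
    for i in range(2, n // 2 + 1):
        steps += 1
        if n % i == 0:
            break
    return steps

# Worst-case inputs: primes just under successive powers of two
# (roughly 8, 9, 10, and 11 bits).
for p in (251, 509, 1021, 2039):
    print(p.bit_length(), trial_steps(p))
```

Each extra bit roughly doubles the step count, which is exactly the O(2^b) behavior described above.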

4. Sep 20, 2007

### michael879

Doesn't that make that program O(n)?

5. Sep 20, 2007

### CRGreathouse

While you're on the topic of brute-force factoring, here's a way to make the program much faster for numbers with small factors. (Without checking to see if factors are prime, you'll never factor numbers with large prime factors quickly.) This checks only two numbers every six (speeding it up by a factor of 3) and doesn't start over every time it finds a new factor.

Code (Text):
brute_force2(n)
{
    if (n%2 == 0)
        return CONCAT(2, brute_force2(n/2));
    if (n%3 == 0)
        return CONCAT(3, brute_force2(n/3));

    return brute_force3(n, 5);
}

brute_force3(n, k)
{
    for(i = k; i * i <= n; i += 6)
    {
        if(n%i == 0)
            return CONCAT(i, brute_force3(n/i, i));
        if(n%(i+2) == 0)
            return CONCAT(i+2, brute_force3(n/(i+2), i));
    }
    return n;
}

6. Sep 20, 2007

### CRGreathouse

It's O(n), yes, but that makes it O(2^b) in the bit length b. Being polynomial in n is easy; being polynomial in b is hard (probably impossible).

7. Sep 20, 2007

### michael879

Oh, are you saying that factoring run times are measured against log2(n) rather than n? Well, no wonder they're all exponential... you're taking a function that's polynomial and taking the log of its x-axis...

8. Sep 20, 2007

### michael879

OK, so yeah, my function's run time fits roughly $2\times10^{-9}\,e^{1.3102b}$ seconds then, i.e. exponential in b. Thanks for the clarification.

9. Sep 20, 2007

### michael879

So the real problem is to find an algorithm that factors in time polynomial in log(n)? That clears up a lot...

10. Sep 20, 2007

### CRGreathouse

The best general-purpose factoring algorithms now known are slower than O(b^k) for every k, but faster than O(k^b) for every k > 1. They're subexponential but superpolynomial.
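A rough numeric illustration of "subexponential but superpolynomial" (the functional form below resembles the run-time bounds for modern factoring algorithms, but the constants are mine and purely illustrative): a function like $L(b)=e^{2\,b^{1/3}(\ln b)^{2/3}}$ eventually beats any fixed polynomial b^k, yet is eventually beaten by any exponential c^b with c > 1.

```python
from math import exp, log

def L(b):
    """Illustrative subexponential growth function (constants are made up)."""
    return exp(2.0 * b ** (1 / 3) * log(b) ** (2 / 3))

# For these bit lengths, L sits strictly between a polynomial (b^4)
# and a modest exponential (1.5^b).
for b in (64, 256, 1024):
    print(b, b ** 4 < L(b) < 1.5 ** b)  # True for each b
```

The sandwich b^4 < L(b) < 1.5^b only kicks in once b is large enough, which is exactly the sense in which "subexponential" and "superpolynomial" are asymptotic statements.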