Trial division Primality

  • #1

Main Question or Discussion Point

Greetings,

This is very simple, and I'm obviously missing something trivial, but I still don't get it.

An algorithm is said to run in polynomial time if it runs in O(n^k) time for some constant k.

That's why Dijkstra's algorithm, for example, which runs in O(V^2), runs in polynomial time.

That said, my question is the following:
if a brute-force trial division primality check runs in O(n^(1/2)), why isn't it considered to run in polynomial time?

Thanks in advance,
Mourinho
 

Answers and Replies

  • #2
Hurkyl
Staff Emeritus
Science Advisor
Gold Member
The running time of an algorithm is usually expressed in terms of the size of the input data.

If you want to test N for primality, the amount of data you need to give your algorithm is O(log N). (e.g. if your algorithm works in base 10, the input size is the number of decimal digits it takes to write N, which is about (log N) / (log 10).)

So, the size of the input is (log N), and the amount of work necessary is N^(1/2)... which is exponential in the size of the input.



(Of course, it would be accurate to say that the running time is polynomial in N)
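A minimal Python sketch of this point (my own illustration, not code from the thread): trial division does roughly sqrt(N) iterations, but the input size is only the bit length of N, so the step count roughly doubles every time you add two bits.

```python
def is_prime_trial_division(N):
    """Trial division: about sqrt(N) iterations in the worst case.

    Returns (is_prime, number_of_divisions_attempted).
    """
    if N < 2:
        return (False, 0)
    d, steps = 2, 0
    while d * d <= N:
        steps += 1
        if N % d == 0:
            return (False, steps)
        d += 1
    return (True, steps)

# Mersenne primes with 13, 19, and 31 bits: the input size (bit length)
# grows linearly, but the work grows roughly like 2^(bits/2).
for N in (8191, 524287, 2147483647):
    prime, steps = is_prime_trial_division(N)
    print(N.bit_length(), steps, prime)
```

Running this shows the step count exploding while the bit length creeps up linearly, which is exactly why the algorithm is exponential in the size of the input.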
 
  • #3
CRGreathouse
Science Advisor
Homework Helper
The AKS primality test takes O(n^6) time* to check if an n-bit number is prime. Trial division takes O(2^(n/2)).

* The exponent depends on the version used. I think the original was O(n^12) time and the best versions so far are conjectured to run in O(n^4) time.
 
