# Trial Division Primality

## Main Question or Discussion Point

Greetings,

This is very simple, and obviously I'm missing something trivial, but I still don't get it.

An algorithm is said to run in polynomial time if it runs in O(n^k) time for some constant k.

That's why Dijkstra's algorithm, for example, which runs in O(V^2), runs in polynomial time.

That said, my doubt is the following:
If a brute-force trial division primality check runs in O(n^(1/2)), why isn't it considered polynomial time?
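For concreteness, this is the kind of brute-force check I mean (a minimal Python sketch; the function name `is_prime` is just illustrative):

```python
def is_prime(n):
    """Brute-force trial division: try every divisor up to sqrt(n)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:   # at most ~sqrt(n) iterations
        if n % d == 0:  # found a nontrivial divisor
            return False
        d += 1
    return True
```

The loop performs at most about n^(1/2) divisions, which is where the O(n^(1/2)) bound comes from.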

Thanks in advance,
Mourinho

Hurkyl
Staff Emeritus
Gold Member
The running time of an algorithm is usually expressed in terms of the size of the input data.

If you want to test N for primality, the amount of data you need to give your algorithm is O(log N). (E.g. if your algorithm works in base 10, the size of the input is the number of decimal digits it takes to write N, which is roughly (log N) / (log 10).)

So, the size of the input is about b = log₂ N, and the amount of work necessary is N^(1/2) = 2^(b/2)... which is exponential in the size of the input.
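A quick sketch to make the growth concrete, assuming the work is about √N trial divisions and the input size is the bit length of N:

```python
import math

# Compare input size (in bits) with trial-division work (~sqrt(N)).
for n in [10**6, 10**12, 10**18]:
    bits = n.bit_length()   # input size, about log2(N)
    work = math.isqrt(n)    # divisions performed, about N^(1/2)
    print(f"N = {n}: input {bits} bits, ~{work} divisions")
```

Every two extra bits of input roughly double the work, since √N = 2^((log₂ N)/2).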

(Of course, it would be accurate to say that the running time is polynomial in the *value* of N; such running times are called pseudo-polynomial.)

CRGreathouse