I'm working on a problem that includes testing large numbers for primality. I'm using Pari/GP's built-in Baillie-PSW pseudoprimality test, but it (understandably!) runs slowly.

I have some questions, then:

* I've been sieving the small prime factors from these numbers. For numbers of the size I'm working with (3300 to 3500 decimal digits), how far should I sieve? I've been checking up to a million, but should I go higher? As it is, I haven't been catching much this way (perhaps 1 in 10), since the numbers are fairly rough (similar to primorial primes). How much effort should I invest in this stage?

* Are there better tests for what I'm doing? If it means anything, I expect "most" (~99.5%) of the numbers to be composite. I would think the p-1 and perhaps p+1 tests would be useful, but are they fast enough if I know the relevant quantity is 'very smooth'?

* Are there other things I should be aware of when working with numbers of this magnitude?
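On the sieving question above: when many candidates are tested against the same set of small primes, one standard trick is to multiply those primes into a single product P once and take gcd(n, P), rather than trial-dividing by each prime. A minimal Python sketch (the function names `primes_up_to` and `passes_small_factor_sieve` are mine, chosen for illustration; in practice you would precompute P once and reuse it across candidates):

```python
import math

def primes_up_to(limit):
    """Sieve of Eratosthenes returning all primes <= limit."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            # Mark every multiple of p starting at p*p as composite.
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return [i for i, flag in enumerate(sieve) if flag]

def passes_small_factor_sieve(n, bound):
    """Return False if n has a prime factor <= bound (assumes n > bound).

    A single gcd against the precomputed prime product replaces
    thousands of trial divisions; for candidates of a few thousand
    digits the gcd is the cheap part.
    """
    P = math.prod(primes_up_to(bound))
    return math.gcd(n, P) == 1
```

For a fixed bound, hoisting `P = math.prod(primes_up_to(bound))` out of the per-candidate loop is the whole point: the sieve and product are paid once, and each candidate then costs one gcd.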
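On the question of cheaper tests: since roughly 99.5% of the candidates are composite, a single base-2 strong probable-prime test (the Miller-Rabin stage that BPSW itself starts with) already rejects almost every composite at the cost of one modular exponentiation, and only survivors need the more expensive Lucas stage. Pari's `ispseudoprime` stages its work this way internally, so this sketch is illustrative rather than an optimization over it:

```python
def strong_probable_prime_base2(n):
    """One strong (Miller-Rabin) test to base 2; assumes n odd, n > 2.

    Returns False only for proven composites.  True means "probably
    prime": the survivor still needs a full BPSW (or similar) test,
    since strong pseudoprimes to base 2 exist (e.g. 2047 = 23 * 89).
    """
    # Write n - 1 = d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(2, d, n)
    if x == 1 or x == n - 1:
        return True
    for _ in range(s - 1):
        x = x * x % n
        if x == n - 1:
            return True
    return False
```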

**Physics Forums | Science Articles, Homework Help, Discussion**


# Practical pseudoprimality testing
