I'm working on a problem that involves testing large numbers for primality. I'm using PARI/GP's built-in Baillie-PSW pseudoprimality test, but it (understandably!) runs slowly. I have some questions, then:

* I've been sieving out small prime factors from these numbers. For numbers of the size I'm working with (3300 to 3500 decimal digits), how far should I sieve? I've been checking up to a million, but should I go higher? As it is, I haven't been catching much this way (perhaps 1 in 10), since the numbers are fairly rough (similar to primorial primes). How much effort should I invest in this stage?
* Are there better tests for what I'm doing? If it means anything, I expect "most" (~99.5%) of the numbers to be composite. I would think the p-1 and perhaps p+1 tests would be useful, but are they fast enough if I know the relevant quantity is 'very smooth'?
* Are there other things I should be aware of when working with numbers of this magnitude?
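For concreteness, here is a minimal Python sketch of the two-stage pipeline described above: trial division by a sieve of small primes first, and an expensive probabilistic test only for the survivors. The function names are my own, and Miller-Rabin stands in for the Baillie-PSW step (in PARI/GP itself one would just call `ispseudoprime` on the survivors); this is an illustration of the structure, not of my actual code.

```python
import random

def small_primes(limit):
    """Sieve of Eratosthenes: all primes up to `limit` (inclusive)."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            # Mark every multiple of p starting at p*p as composite.
            sieve[p * p :: p] = bytearray(len(range(p * p, limit + 1, p)))
    return [i for i in range(limit + 1) if sieve[i]]

def survives_trial_division(n, primes):
    """Cheap pre-filter: False if any small prime divides n.

    Assumes n is larger than every prime in `primes` (true for
    3300-digit candidates), so n == p never needs special-casing.
    """
    return all(n % p != 0 for p in primes)

def miller_rabin(n, rounds=20):
    """Probabilistic compositeness test (stand-in for BPSW)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite
    return True  # probably prime

def probably_prime(n, primes):
    """Full pipeline: sieve stage, then the expensive test."""
    return survives_trial_division(n, primes) and miller_rabin(n)
```

Since ~99.5% of the candidates are composite, the question of how far to sieve is a trade-off between the cost of one extra trial division per candidate and the (diminishing) chance of skipping a full pseudoprimality test.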