Practical pseudoprimality testing

by CRGreathouse
Tags: practical, pseudoprimality, testing
CRGreathouse
#1
Dec1-05, 10:24 PM
Sci Advisor
HW Helper
P: 3,684
I'm working on a problem that includes testing large numbers for primality. I'm using Pari/GP's built-in Baillie-PSW pseudoprimality test, but it (understandably!) runs slowly.

I have some questions, then:
* I've been sieving the small prime factors from these numbers. For numbers of the size I'm working with (3300 to 3500 decimal digits), how far should I sieve? I've been checking up to a million, but should I go higher? As it is, I haven't been catching much this way (perhaps 1 in 10), since the numbers are fairly rough (similar to primorial primes). How much effort should I invest in this stage? (See the sketch after this list.)
* Are there better tests for what I'm doing? If it means anything, I expect "most" (~99.5%) of the numbers to be composite. I would think the p-1 and perhaps p+1 tests would be useful, but are they fast enough if I know the relevant quantity is 'very smooth'?
* Are there other things I should be aware of when working with numbers of this magnitude?
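
A minimal GP sketch of that sieving stage, trial-dividing by primes up to a bound B before paying for the full Baillie-PSW test; the names trialfactor and isworthtesting are illustrative helpers, not anything built into PARI:

Code:
\\ smallest prime factor of n below B, or 0 if none is found
trialfactor(n, B) = {
  forprime(p = 2, B, if (n % p == 0, return(p)));
  0;
}

\\ run the expensive test only on survivors of the sieve
isworthtesting(n, B) = trialfactor(n, B) == 0 && ispseudoprime(n);

For many candidates of this size it can be cheaper to take gcd(n, P) against a precomputed product P of all the sieving primes than to divide by one prime at a time.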
Hurkyl
#2
Dec1-05, 10:46 PM
Emeritus
Sci Advisor
PF Gold
Hurkyl's Avatar
P: 16,091
Quote by CRGreathouse
I've been sieving the small prime factors from these numbers. For numbers of the size I'm working with (3300 to 3500 decimal digits), how far should I sieve? I've been checking up to a million, but should I go higher? As it is, I haven't been catching much this way (perhaps 1 in 10), since the numbers are fairly rough (similar to primorial primes). How much effort should I invest in this stage?
Empiricism is the best guide.

What you need is a reasonable estimate of how much work the sieving takes versus how long the rest of your algorithm takes on what survives. If you measure this on smaller values, you should be able to build a good model which you can extrapolate to the problem size of interest.

Once you have a good model, you can simply find the parameters which yield the best running time.
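
A sketch of that measurement in GP, using the built-in gettime() millisecond timer; the sample vector and the trialfactor helper (sketched in the first post) are assumptions of this sketch, not part of any library:

Code:
\\ average cost in ms of sieving to B plus the full test on survivors,
\\ taken over a vector of sample candidates
avgcost(samples, B) = {
  gettime();  \\ reset the timer
  for (i = 1, #samples,
    if (trialfactor(samples[i], B) == 0, ispseudoprime(samples[i])));
  gettime() / #samples;
}

\\ compare a few sieve bounds, assuming samples is already filled with
\\ candidates drawn from the same family as the real problem
for (e = 3, 7, print("B = 10^", e, ": ", avgcost(samples, 10^e) * 1., " ms"));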


Quote by CRGreathouse
Are there better tests for what I'm doing? If it means anything, I expect "most" (~99.5%) of the numbers to be composite.
Well, Wikipedia and MathWorld both have pages on primality testing, and they're usually good sources of information. MathWorld seems to advocate the elliptic curve method, but Wikipedia asserts that the cyclotomy test is better for numbers with fewer than 10,000 digits.

If a probabilistic algorithm is sufficient for your purposes, they both seem to advocate the Rabin-Miller strong pseudoprime test.
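
For reference, a minimal GP sketch of a single strong pseudoprime (Rabin-Miller) round, assuming n is odd and a is in the range 2..n-2; this is just the textbook test written out, not PARI's internal routine:

Code:
\\ write n - 1 = 2^s * d with d odd, then test base a
isstrongpsp(n, a) = {
  my(s = valuation(n - 1, 2), d = (n - 1) >> s, x = Mod(a, n)^d);
  if (x == 1 || x == -1, return(1));
  for (i = 1, s - 1,
    x = x^2;
    if (x == -1, return(1)));
  0;
}

Repeating this with k independently chosen random bases drives the error probability for a composite n below 4^-k.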

I'm not familiar with "Pari", "GP", or "Baillie-PSW" at all, so I can't say anything about that.

Quote by CRGreathouse
Are there other things I should be aware of when working with numbers of this magnitude?
You might consider that there's a nonzero chance of hardware failure during long-running programs, so a faster probabilistic algorithm might actually be more reliable than a longer-running deterministic one. It sounds like you're using a probabilistic algorithm anyway, though.

Um... I suppose you could also try out various large-number packages to find the one that performs best -- it's possible that some packages are much better than others for this kind of thing. Besides, they probably even come with well-optimized primality-proving functions!
C1ay
#3
Dec1-05, 11:03 PM
P: 16
You might take a look at the source code available from GIMPS. They've been searching for the world's largest primes for years now and their code has been tweaked for efficiency.

CRGreathouse
#4
Dec1-05, 11:09 PM
Sci Advisor
HW Helper
P: 3,684
Quote by Hurkyl
MathWorld seems to advocate the elliptic curve method, but Wikipedia asserts that the cyclotomy test is better for numbers with fewer than 10,000 digits.
I've heard of the cyclotomy test, but I've never actually seen it implemented or even seen a good explanation of how it works.

Quote by Hurkyl
If a probabilistic algorithm is sufficient for your purposes, they both seem to advocate the Rabin-Miller strong pseudoprime test.
"A Baillie-Pomerance-Selfridge-Wagstaff pseudoprime [is a] (strong Rabin-Miller pseudo prime for base 2, followed by strong Lucas test for the sequence (P,-1), P smallest positive integer such that P^2-4 is not a square mod x)."

Quote by Hurkyl
I'm not familiar with "Pari", "GP", or "Baillie-PSW" at all, so I can't say anything about that.
Really? PARI is a fast calculation engine, and GP is a calculator/programming interface for PARI. I use it fairly often.
http://pari.math.u-bordeaux.fr/
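
For the curious, the test in question is GP's ispseudoprime. A one-line session; 2^127 - 1 is a (known) Mersenne prime, so the test reports it as a probable prime:

Code:
? ispseudoprime(2^127 - 1)
%1 = 1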
CRGreathouse
#5
Dec1-05, 11:13 PM
Sci Advisor
HW Helper
P: 3,684
Quote by C1ay
You might take a look at the source code available from GIMPS. They've been searching for the world's largest primes for years now and their code has been tweaked for efficiency.
I think that their code is designed for Mersenne numbers (Lucas-Lehmer testing), not for general use.

