Practical pseudoprimality testing

  • Context: Graduate
  • Thread starter: CRGreathouse
  • Tags: Practical Testing
SUMMARY

This discussion focuses on optimizing primality testing for large numbers, specifically using the Baillie-PSW pseudoprimality test in Pari/GP. The user is currently sieving small prime factors up to one million for numbers with 3300 to 3500 decimal digits but is advised to evaluate the efficiency of this approach. Recommendations include exploring the Rabin-Miller strong pseudoprime test and the elliptic curve method, as well as considering the cyclotomy test for numbers under 10,000 digits. The importance of modeling the sieve versus algorithm runtime is emphasized for better performance.

PREREQUISITES
  • Familiarity with the Baillie-PSW pseudoprimality test
  • Understanding of the Rabin-Miller strong pseudoprime test
  • Knowledge of elliptic curve methods for primality testing
  • Experience with Pari/GP programming environment
NEXT STEPS
  • Research the implementation of the Rabin-Miller strong pseudoprime test
  • Explore the elliptic curve method for primality testing
  • Investigate the cyclotomy test and its applications for large numbers
  • Examine the source code from GIMPS for efficient primality testing techniques
USEFUL FOR

Mathematicians, computer scientists, and software developers involved in number theory, particularly those focused on primality testing and optimization of algorithms for large integers.

CRGreathouse
I'm working on a problem that includes testing large numbers for primality. I'm using Pari/GP's built-in Baillie-PSW pseudoprimality test, but it (understandably!) runs slowly.

I have some questions, then:
* I've been sieving the small prime factors from these numbers. For numbers of the size I'm working with (3300 to 3500 decimal digits), how far should I sieve? I've been checking up to a million, but should I go higher? As it is, I haven't been catching much this way (perhaps 1 in 10), since the numbers are fairly rough (similar to primorial primes). How much effort should I invest in this stage?
* Are there better tests for what I'm doing? If it means anything, I expect "most" (~99.5%) of the numbers to be composite. I would think the p-1 and perhaps p+1 tests would be useful, but are they fast enough if I know the relevant quantity is 'very smooth'?
* Are there other things I should be aware of when working with numbers of this magnitude?
 
CRGreathouse said:
* I've been sieving the small prime factors from these numbers. For numbers of the size I'm working with (3300 to 3500 decimal digits), how far should I sieve? I've been checking up to a million, but should I go higher? As it is, I haven't been catching much this way (perhaps 1 in 10), since the numbers are fairly rough (similar to primorial primes). How much effort should I invest in this stage?
Empiricism is the best guide. :smile:

What you need to do is to get a reasonable estimate of how much work it takes to sieve vs how long it takes to run the rest of your algorithm on what survives. If you do this with smaller values, you should be able to get a good model which you can extrapolate to the problem size of interest.

Once you have a good model, you can simply find the parameters which yield the best running time.
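That modeling step can be sketched in Python. Everything below is illustrative: a cheap Fermat check stands in for the expensive Baillie-PSW call, and the helper names are made up.

```python
import time

def primes_up_to(limit):
    """Sieve of Eratosthenes: all primes <= limit."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return [i for i, flag in enumerate(sieve) if flag]

def fermat_base2(n):
    """Cheap stand-in for the expensive pseudoprime test."""
    return pow(2, n - 1, n) == 1

def time_sieve_bound(candidates, bound):
    """Seconds to trial-divide every candidate by primes up to `bound`,
    then run the (stand-in) test on the survivors."""
    small = primes_up_to(bound)  # reused across candidates, so built outside the timer
    start = time.perf_counter()
    survivors = [n for n in candidates if all(n % p for p in small)]
    for n in survivors:
        fermat_base2(n)
    return time.perf_counter() - start

# Model on small inputs, then extrapolate: real use would plug in the
# 3300-digit numbers and Pari's ispseudoprime in place of fermat_base2.
cands = [2 ** 400 + 2 * k + 1 for k in range(50)]
timings = [(b, time_sieve_bound(cands, b)) for b in (10 ** 3, 10 ** 4, 10 ** 5)]
```

Fitting a curve to such timings at several candidate sizes gives the extrapolation; the best sieve bound is simply the one minimizing the modeled total.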


CRGreathouse said:
* Are there better tests for what I'm doing? If it means anything, I expect "most" (~99.5%) of the numbers to be composite.
Well, Wikipedia and Mathworld both have pages on primality testing, and they're usually good sources of information. Mathworld seems to advocate the elliptic curve method, but Wikipedia asserts that the cyclotomy test is better for numbers with fewer than 10,000 digits.

If a probabilistic algorithm is sufficient for your purposes, they both seem to advocate the Rabin-Miller strong pseudoprime test.
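For concreteness, here is a minimal Python sketch of that Rabin-Miller test (random bases; real packages such as Pari/GP ship far more tuned versions):

```python
import random

def miller_rabin(n, rounds=20):
    """Rabin-Miller strong pseudoprime test with random bases."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 = d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite
    return True  # probably prime
```

Each round, a random base either proves n composite or lends confidence; a composite survives a single round with probability below 1/4, so 20 rounds bound its survival chance below 4^-20.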

I'm not familiar with "Pari", "GP", or "Baillie-PSW" at all, so I can't say anything about that.

CRGreathouse said:
* Are there other things I should be aware of when working with numbers of this magnitude?
You might consider that there's a nonzero chance of hardware failure with long-running programs, so a faster probabilistic algorithm might actually be more reliable than a longer-running deterministic one. It sounds like you're using a probabilistic algorithm anyway, though.

Um... I suppose you could also try out various large number packages, and try to find the one that seems to perform the best -- it's possible that some packages are much better than others for this type of thing. Besides, they probably even come with well-optimized primality proving functions!
 
You might take a look at the source code available from GIMPS. They've been searching for the world's largest primes for years now and their code has been tweaked for efficiency.
 
Hurkyl said:
Mathworld seems to advocate the elliptic curve method, but Wikipedia asserts that the cyclotomy test is better for numbers with fewer than 10,000 digits.

I've heard of the cyclotomy test, but I've never actually seen it implemented or even seen a good explanation of how it works.

Hurkyl said:
If a probabilistic algorithm is sufficient for your purposes, they both seem to advocate the Rabin-Miller strong pseudoprime test.

"A Baillie-Pomerance-Selfridge-Wagstaff pseudoprime [is a] (strong Rabin-Miller pseudoprime for base 2, followed by strong Lucas test for the sequence (P,-1), P smallest positive integer such that P^2-4 is not a square mod x)."
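The parameter choice in that quote is easy to sketch. Below, `jacobi` is a hand-rolled Jacobi symbol and `lucas_P` (an illustrative name) finds the smallest P the quote describes; the strong Lucas test itself is omitted, and n is assumed odd (for a perfect square n, no such P exists):

```python
import math

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0, via quadratic reciprocity."""
    assert n > 0 and n % 2 == 1
    a %= n
    result = 1
    while a:
        while a % 2 == 0:
            a //= 2
            if n % 8 in (3, 5):   # (2/n) = -1 when n = +-3 (mod 8)
                result = -result
        a, n = n, a               # reciprocity: flip sign if both = 3 (mod 4)
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def lucas_P(n):
    """Smallest positive P with Jacobi(P^2 - 4, n) = -1, i.e. P^2 - 4
    a non-square mod n -- the parameter choice quoted above."""
    if math.isqrt(n) ** 2 == n:
        raise ValueError("n is a perfect square; no such P exists")
    P = 1
    while jacobi(P * P - 4, n) != -1:
        P += 1
    return P
```

For odd non-square n roughly half the residues are non-squares, so the search terminates after a few candidates in practice.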

Hurkyl said:
I'm not familiar with "Pari", "GP", or "Baillie-PSW" at all, so I can't say anything about that.

Really? PARI is a fast calculation engine, and GP is a calculator/programming interface for PARI. I use it fairly often.
http://pari.math.u-bordeaux.fr/
 
C1ay said:
You might take a look at the source code available from GIMPS. They've been searching for the world's largest primes for years now and their code has been tweaked for efficiency.

I think that their code is designed for Mersenne primes, not for general use.
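For context on why it doesn't generalize: GIMPS is built around the Lucas-Lehmer test, which applies only to Mersenne numbers 2^p - 1 (their speed comes from FFT squaring specialized to that form). The textbook version is tiny; this Python sketch is illustrative only:

```python
def lucas_lehmer(p):
    """Lucas-Lehmer: for prime p, M = 2^p - 1 is prime iff s(p-2) == 0 (mod M),
    where s(0) = 4 and s(k+1) = s(k)^2 - 2."""
    if p == 2:
        return True  # M = 3 is prime
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0
```

The reduction mod 2^p - 1 is exactly where GIMPS's optimizations live, and none of it carries over to arbitrary 3300-digit integers.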
 
