What is the fastest algorithm for finding large prime numbers?

Discussion Overview

The discussion revolves around algorithms for finding large prime numbers, particularly in the context of RSA encryption. Participants explore various methods for efficiently identifying primes and the implications for cryptographic security.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant inquires about algorithms used to find large prime numbers for RSA encryption, noting the computational challenges involved.
  • Another mentions that while many algorithms can quickly identify composite numbers, proving a number is prime is more complex.
  • Several participants reference the Miller-Rabin probabilistic algorithm as a method for quickly rejecting composite candidates, though they note that it does not certify primes.
  • There is a discussion about the trade-off between the certainty of primality and the speed of generation, suggesting that for practical cryptography, certified primes may not always be necessary.
  • One participant suggests that multiple algorithms should be employed to increase confidence in the primality of candidates.
  • Another mentions the AKS test as a method for certifying candidates from the Miller-Rabin test, but notes its practical limitations compared to other tests like ECPP and Baillie–PSW.

Areas of Agreement / Disagreement

Participants express various viewpoints on the efficiency and necessity of different primality tests, indicating that there is no consensus on the best approach. Some argue for the use of probabilistic methods, while others emphasize the need for more rigorous certification.

Contextual Notes

Participants highlight the complexity of proving primality and the computational resources required for different algorithms. There are also mentions of the trade-offs involved in cryptographic applications, such as the balance between speed and certainty.

Who May Find This Useful

This discussion may be useful for individuals interested in cryptography, number theory, and computational methods for prime number generation.

KingGambit
TL;DR: Public Key, Private Key, Prime Numbers, RSA, Encryption
Dear PF Forum,
Can someone help me with the algorithm for finding a very large prime number?
In RSA encryption (1024 bit? 2048? I forget, I should look it up on the wiki), the private key is built from a pair of two large prime numbers.
Now, what I wonder is: what algorithm does the computer use to find those large prime numbers so fast?

Suppose one prime number is half of 1024 bits, i.e. a 512-bit number; that's about 10^(512 * 3/10), somewhere around 10^154.

And let's say we find some 10^154 number by a random process. If I were to check whether it's prime by trial division, I would have to loop up to its square root, which is about 10^77.
Looping 10^77 times would take a very, very long time, yet Windows or a Bitcoin node gives me a private key relatively fast.

Even if we used an array (a sieve), which is quite impossible, it would take a computer with DDR4 RAM as big as the Milky Way, I think.

I know that I'm asking a lot for an explanation here, but I've tried to google the algorithm for finding a very large prime, and I haven't had any luck so far.
So if someone knows a link to such an algorithm, I'd be really grateful.
Thank you.
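
Here is a quick back-of-the-envelope check of those sizes in Python (rough order-of-magnitude estimates only, nothing exact):

Python:
import math

bits = 512

# A 512-bit number has about 512 * log10(2) ~ 154 decimal digits.
print("candidate size    ~ 10^%d" % round(bits * math.log10(2)))

# Trial division would have to loop up to the square root, ~ 10^77.
print("square root       ~ 10^%d" % round(bits * math.log10(2) / 2))

# By the prime number theorem, primes near 2^512 occur about once every
# ln(2^512) ~ 355 integers, so only a few hundred random odd candidates
# are needed on average before one of them is prime.
print("average prime gap ~ %d" % round(bits * math.log(2)))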
 
Have you tried Googling this?
One of the first hits led me to


I tried the Python script on the page (copy 'n' paste into Jupyter) and it takes a couple of seconds to generate a 1024-bit number; 2048 only takes a bit longer
 
KingGambit said:
Now, what I wonder is, what algorithm that the computer use to find those large prime number very fast?
There are many algorithms that will quickly reject a large random number if it is composite. It is more difficult to prove that a large number is prime.

The security of encryption systems is often based on the difficulty of factorising the product of two large primes.
Certifying that big numbers are prime must be very much faster than factorisation of their products, or the system would be more of a handicap to the user than to the enemy.
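
A minimal sketch of that kind of quick rejection (my own illustration, not code from any particular library): trial division by a handful of small primes, then a single Fermat check. Most random composites fall at this hurdle almost immediately; the survivors still need stronger testing.

Python:
import random

SMALL_PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]

def quick_reject(n: int) -> bool:
    """Return True if n is certainly composite, False if it might be prime."""
    if n < 2:
        return True
    for p in SMALL_PRIMES:
        if n == p:
            return False
        if n % p == 0:
            return True
    # Fermat check: if a^(n-1) != 1 (mod n), then n is composite for certain.
    a = random.randrange(2, n - 1)
    return pow(a, n - 1, n) != 1

Note the asymmetry: a failed Fermat check proves compositeness, but passing it proves nothing (Carmichael numbers pass it for every base coprime to them), which is exactly why proving primality is the harder direction.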
 
Baluncore said:
There are many algorithms that will quickly reject a large random number if it is composite. It is more difficult to prove that a large number is prime.

The security of encryption systems is often based on the difficulty of factorising the product of two large primes.
Certifying that big numbers are prime must be very much faster than factorisation of their products, or the system would be more of a handicap to the user than to the enemy.
I think you need to add a few 'probably's in there :wink:
 
Get a copy of Prime Numbers: A Computational Perspective, by Richard Crandall and Carl Pomerance, 2005.
See Chapter 4, Primality Proving.
 
f95toli said:
i tried the Python script on the page (copy 'n' paste into Jupyter) and it takes a couple of seconds to generate a 1024 bit number, 2048 only takes a bit longer
The application of the Miller-Rabin probabilistic algorithm is a good way of quickly rejecting composite numbers.
The survivors produced are strong prime candidates. They are NOT certified prime.
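
For anyone curious what that looks like in code, here is a bare-bones sketch of Miller-Rabin plus the usual generate-and-test loop (my own simplification, using only Python's built-in random and pow; real libraries add trial division by small primes, a cryptographic random source, and carefully chosen round counts):

Python:
import random

def miller_rabin(n: int, rounds: int = 40) -> bool:
    """Return False if n is composite, True if it is a strong probable prime."""
    if n in (2, 3):
        return True
    if n < 2 or n % 2 == 0:
        return False
    # Write n - 1 = 2^s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False      # found a witness: n is definitely composite
    return True               # survived every round: strong probable prime

def random_probable_prime(bits: int = 512) -> int:
    """Draw random odd candidates of the requested size until one survives."""
    while True:
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1  # force top bit and oddness
        if miller_rabin(n):
            return n

print(random_probable_prime(512))

Because almost every composite candidate is rejected by its very first round, the loop does far less work than "candidates times rounds" suggests, which is why a 1024-bit probable prime appears within seconds.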
 
f95toli said:
Have you tried Googling this?
One of the first hits led me to


I tried the Python script on the page (copy 'n' paste into Jupyter) and it takes a couple of seconds to generate a 1024-bit number; 2048 only takes a bit longer

Wow, thank you. I'll check it.
 
Baluncore said:
The security ... is based on the difficulty of factorising the product of two large primes.
Certifying that big numbers are prime must be very much faster...
Thanks Baluncore, it's the certifying that I'm trying to find out.
 
  • #10
Thank you very much, everybody.
 
  • #11
KingGambit said:
Thanks Baluncore, it's the certifying that I'm trying to find out.

But you don't need certified primes for most practical cryptography; there is a trade-off between certainty (the likelihood that a number is prime) and speed, and for something like RSA you can probably live with a very, very small risk if it means you can generate large numbers on the fly whenever needed. It is still safer than e.g. using a lookup table.
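
To put a rough number on that trade-off (my own back-of-the-envelope, using the standard worst-case bound that each Miller-Rabin round lets a composite survive with probability at most 1/4):

Python:
# Worst-case chance that one composite slips through k Miller-Rabin rounds.
# For random candidates the real failure rate is far smaller still.
for k in (10, 20, 40, 64):
    print("%2d rounds: error bound %.1e" % (k, 0.25 ** k))

At 40 to 64 rounds the bound is already far below, say, the chance of a hardware fault corrupting the computation, which is the usual argument for accepting probable primes in practice.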
 
  • #12
Baluncore said:
There are many algorithms that will quickly reject a large random number if it is composite. It is more difficult to prove that a large number is prime.
Miller-Rabin is only one of the many tests that are available. More algorithms should be employed, more trials must be run. For real encryption, it will take longer than a few seconds to get the confidence and security required of an “industrial-grade prime”.
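
As one concrete example of that layered approach (my example, not mentioned in the thread), a library such as SymPy already chains several tests; as I understand it, sympy.isprime does trial division by small primes and then a strong Baillie-PSW style test for large inputs:

Python:
from sympy import isprime, randprime

# randprime(a, b) returns a random prime in the range [a, b); here, 512 bits.
p = randprime(2**511, 2**512)

print(p.bit_length())   # 512
print(isprime(p))       # True (several layered tests internally, not a single one)

Even so, for very large inputs the library result is still a probable prime unless a certificate (e.g. from ECPP) is produced.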
 
  • #13
I am not a number theorist, and it's been a while since I looked seriously at RSA, but...

I believe the only requirement for p and q is that they are relatively prime. However, if you pick a number that is composite, the brute-force attack time goes as the smallest factor of pq. As mentioned, there are many tests for "probable" primality, many of which give a quantitative measure of the likelihood the number is actually composite.

So, as a practical matter, does someone care if there is a 0.001% chance that a brute force attack takes 100 years rather than 100,000 years? Maybe, maybe not. I will say that in general making the most cryptographically secure section even stronger doesn't always help. When it becomes easier to bribe your way to get the key, that's what will happen.
 
  • #14
Once one has a candidate from Miller-Rabin, one can then certify (or reject) it with the AKS Test.

This video describes the gist of the test, and has good reference links in its description:

 
  • #15
The Bill said:
Once one has a candidate from Miller-Rabin, one can then certify (or reject) it with the AKS Test.
https://en.wikipedia.org/wiki/AKS_primality_test#Importance
"While the algorithm is of immense theoretical importance, it is not used in practice, rendering it a galactic algorithm.

For 64-bit inputs, the Baillie–PSW primality test is deterministic and runs many orders of magnitude faster.

For larger inputs, the performance of the (also unconditionally correct) ECPP and APR tests is far superior to AKS. Additionally, ECPP can output a primality certificate that allows independent and rapid verification of the results, which is not possible with the AKS algorithm. "
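
One footnote to the 64-bit remark in that quote: for word-sized inputs, Miller-Rabin itself can also be made fully deterministic by checking a small fixed set of witness bases; the first twelve primes are known to be enough for every n below 2^64. A quick sketch of my own, separate from the Baillie-PSW test the article recommends:

Python:
def is_prime_u64(n: int) -> bool:
    """Deterministic Miller-Rabin for n < 2**64 using fixed witness bases."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 = 2^s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

print(is_prime_u64(2**61 - 1))   # True: a Mersenne prime
print(is_prime_u64(2**62 - 1))   # False: divisible by 3

For cryptographic sizes (hundreds of digits) this shortcut no longer applies, and one is back to probable primes or ECPP certificates.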
 
