Exploring Prime Numbers: Digit Counts

In summary, the conversation is about the recent discovery of a new Mersenne prime and the interest it has sparked in prime numbers. The discussion covers topics such as the "digits" column in prime number tables, the concept of Mersenne primes, and the search for larger Mersenne primes. The conversation also touches on the use of prime numbers in cryptography and the capabilities of code crackers to factor large numbers.
  • #1
endfx
So I just heard about the new prime number that was discovered and for some reason got kind of interested in it.

So I'm looking at prime number tables on various webpages that show the prime numbers, dates discovered, etc.

I'm confused about what the "digits" column in these tables means.

For example, the prime number 5 has 2 digits, and the prime number 13 has 4 digits. What are these digit numbers?
How do you get 2 digits from the prime number 5?

Thanks.
 
  • #2
Was this the list you were looking at? It doesn't say that 5 has 2 digits or that 13 has 4 digits, it says that 2^5 - 1 (i.e. 31) has 2 digits and that 2^13 - 1 (i.e. 8191) has 4 digits.
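
For anyone who wants to check this themselves, here is a quick Python sketch (just an illustration) that reproduces those digit counts, plus the shortcut floor(p*log10(2)) + 1 that gives the count for huge exponents without building the number at all:

[code]
from math import floor, log10

# Digit counts of the Mersenne numbers 2**p - 1 from the tables.
for p in (5, 13):
    m = 2**p - 1
    print(p, m, len(str(m)))            # 5 -> 31 has 2 digits, 13 -> 8191 has 4 digits

# For huge exponents, floor(p*log10(2)) + 1 gives the same digit count
# without ever writing the number out.
print(floor(25964951 * log10(2)) + 1)   # 7816230
[/code]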
 
  • #3
The tables I was looking at were like that, but that wasn't the particular page I was viewing. But thanks for explaining it to me, and thanks a lot for that link as well. I like how it explains the history of primes and why they are important.
:smile:
 
  • #4
endfx, you might want to Google "Mersenne Primes"
 
  • #5
just read on the net that there are only 41 such numbers!

how come... can't you just insert any prime number into 2^p -1??

so if we want to find a BIG one we can take the present biggest and insert it into 2^p-1...or?
 
  • #6
strid said:
just read on the net that there are only 41 such numbers!

how come... can't you just insert any prime number into 2^p -1??

so if we want to find a BIG one we can take the present biggest and insert it into 2^p-1...or?
Not every prime p in the above formula gives a Mersenne prime. For instance, (2^11) - 1 = 2047 = 23*89.
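
A rough check in Python (assuming the sympy library is available for the primality test) shows which small prime exponents actually work:

[code]
from sympy import isprime  # assumption: sympy is installed

# Not every prime exponent p makes 2**p - 1 prime.
for p in [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31]:
    m = 2**p - 1
    print(p, m, isprime(m))
# p = 11 gives 2047 = 23 * 89, and p = 23 and p = 29 also fail.
[/code]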
 
  • #7
strid said:
just read on the net that there are only 41 such numbers!

42 known now, see the GIMPS webpage (there's a recent post in this forum about it too). The known is an important distinction, there may be many more (possibly infinitely many).
 
  • #8
shmoe said:
42 known now, see the GIMPS webpage (there's a recent post in this forum about it too). The known is an important distinction, there may be many more (possibly infinitely many).

[tex]2^{25,964,951}-1[/tex]

It has 7,816,230 digits...yikes! :eek:

You can fill a phone book with it (I think).
 
Last edited:
  • #9
Galileo said:
[tex]2^{225,964,951}-1[/tex]

It has 7,816,230 digits...yikes! :eek:

You can fill a phone book with it (I think).
I think you'll find you've added an extra 2 there, and I really doubt the phone book claim: consider that in England way, way over 1 million people have phones, and every number has 11 digits (including the area code).
 
  • #10
Why is it still impossible to devise a function that correlates a number with a prime number?
 
  • #11
Icebreaker said:
Why is it still impossible to devise a function that correlates a number with a prime number?

Why do you say it's impossible? Let p(n) = the nth prime number; there's your function. There are also plenty of formulas that spit out primes, though none are really computationally useful.
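
For what it's worth, here is a minimal Python sketch of that "cheap" p(n): sieve up to some bound and index in. It only works up to the bound, which is exactly why it says nothing about unknown primes.

[code]
def primes_up_to(limit):
    """Sieve of Eratosthenes: every prime <= limit."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for k in range(2, int(limit ** 0.5) + 1):
        if sieve[k]:
            sieve[k * k::k] = bytearray(len(sieve[k * k::k]))
    return [i for i, flag in enumerate(sieve) if flag]

primes = primes_up_to(1000)

def p(n):
    return primes[n - 1]   # p(1) = 2, p(2) = 3, ...

print(p(10), p(100))       # 29 541
[/code]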


I took a look at my local phonebook (Toronto area) and it appears to hold something like 25,000 characters per page (roughly 200 across and 125 lines per page). So this new Mersenne prime would be over 300 pages, but the book itself has over 2000 pages. So it falls short of the Toronto phonebook size, but it's probably on par with some smaller Canadian cities.
 
  • #12
let p(n) = the nth prime number is useless in predicting the nth prime, if n > the largest prime we know
 
  • #13
Icebreaker said:
let p(n) = the nth prime number is useless in predicting the nth prime, if n > the largest prime we know

It's worse than that. We don't come anywhere near knowing all the primes less than the largest known one. My point was there are functions that map the naturals to the primes, there are even ones that look less cheap than my p(n) one. There are also asymptotics for p(n), other formulas that output only primes, and that nasty polynomial in many (26?) variables whose positive values are exactly the primes. All are interesting, none really computationally friendly. Here's a nice collection from mathworld:

http://mathworld.wolfram.com/PrimeFormulas.html
 
  • #14
I took a look at my local phonebook (Toronto area) and it appears to hold something like 25,000 characters per page (roughly 200 across and 125 lines per page). So this new Mersenne prime would be over 300 pages, but the book itself has over 2000 pages. So it falls short of the Toronto phonebook size, but it's probably on par with some smaller Canadian cities.
Dutch phonebooks aren't that big :biggrin:
I checked. About 6000 digits will fit per page and we have about 1000 pages. So roughly 6,000,000 digits will fit.
So in retrospect, the new Mersenne prime will more than fill a Dutch phone book. :tongue:
 
  • #15
10 million digits.

Galileo said:
It has 7,816,230 digits...yikes! :eek:
Since there is a prize of $100,000 for the discoverer of a Mersenne prime of more than 10 million digits, GIMPS addicts are now searching with exponents larger than 34,000,000. So maybe M43 (or M44) might not fit in any phonebook on Earth ...
See: Internet PrimeNet Server and GIMPS.
Tony
 
  • #16
I think it is useful to discuss the general capacity of "code crackers" to find prime factors of "large numbers." Well, about 20 years ago it was generally thought that if two 100 digit primes were multiplied together it was, as a practical matter, impossible to factor this 200 digit number. This fact was used in constructing secret codes.

Well, today on the internet I found this: "This function creates keys using the method described in the Procedure section. It first generates two 100-digit prime numbers p and q by initializing them both to 10^100 and incrementing them by 1 and -1, respectively. Each time they are incremented, they are tested for primality. In this way, p and q were found to be:..."

The writer is then suggesting that, as a general rule, such 200 digit numbers cannot be factored as a practical matter. http://ashvin.flatirons.org/projects/crackingthecode/results.html
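
Just to make the quoted procedure concrete, here is a rough Python sketch of it. It leans on sympy's nextprime/prevprime for the primality searches, which is an assumption on my part, not necessarily what that site used:

[code]
from sympy import nextprime, prevprime  # assumption: sympy is installed

# Mimic the quoted key-generation step: start both candidates at 10**100
# and move them in opposite directions until each tests prime.
p = nextprime(10**100)   # first prime above 10**100
q = prevprime(10**100)   # largest prime below 10**100
n = p * q                # the kind of ~200-digit product discussed above
print(len(str(n)))       # roughly 200 digits
[/code]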

You have to understand that a Mersenne prime is a special case for which mathematicians and programmers have special criteria to work with.
 
Last edited by a moderator:
  • #17
robert Ihnot said:
You have to understand that a Mersenne prime is a special case for which mathematicians and programmers have special criteria to work with.
True. But the Maths used for proving the primality of Mersenne numbers can be used for proving the primality of other kinds of numbers, like Fermat numbers and many others. But this kind of Maths seems to be less and less used and taught, as far as I know (but I'm NOT a mathematician). Don't know why.
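
For Mersenne numbers the standard tool is the Lucas-Lehmer test. Here is a minimal Python sketch of it, fine for small exponents (GIMPS of course uses heavily optimized arithmetic):

[code]
def lucas_lehmer(p):
    """For an odd prime p, 2**p - 1 is prime iff s ends at 0
    after p - 2 steps of s -> (s*s - 2) mod (2**p - 1)."""
    if p == 2:
        return True                 # 2**2 - 1 = 3 is prime
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

print([p for p in (2, 3, 5, 7, 11, 13) if lucas_lehmer(p)])  # [2, 3, 5, 7, 13]
[/code]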
Tony
 
  • #18
We also have to realize that code crackers are clever people, and if they know or suspect that two 100 digit numbers were used to generate the 200 digit product, it narrows down their search for factors. I.e. factoring a 200 digit number that was generated by someone purposely from two 100 digit numbers is not as difficult as factoring an arbitrary 200 digit number. This is a current topic of research.

What Robert is saying is that if they also know the number is a Mersenne number, then the job of factoring it is even more restricted.

In fact 10 or 20 years ago, even factoring special 100 digit numbers was pretty tough. But one of my friends programmed a chain of PC's to do it in a few months, working entirely at night when the PC's were not in use.

You also need to realize that the difficulty of proving a big number is prime is much much less than that of factoring a non prime number.

I.e. there are tests for primality that run hugely faster than factoring algorithms, due to special properties of prime numbers like Fermat's little theorem. This particular property does not quite characterize primes, since some non prime numbers also obey the conclusion of Fermat's little theorem, but a small enhancement of it does so.

Now there are much faster algorithms using elliptic curves, etc... that detect prime numbers.

So the job of proving a certain Mersenne number is prime is nothing like the task of factoring one that is not prime.
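
As a concrete illustration of the Fermat-little-theorem idea (the "small enhancement" mentioned above is presumably the strong, Miller-Rabin-style version of this test), here is a toy Python version. Note how a Carmichael number like 561 can fool any single coprime base:

[code]
import random

def fermat_test(n, rounds=20):
    """If n is prime, pow(a, n-1, n) == 1 for every a coprime to n (Fermat's little theorem).
    Composites that pass for some base are called Fermat pseudoprimes."""
    if n < 4:
        return n in (2, 3)
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:
            return False          # definitely composite
    return True                   # probably prime

print(fermat_test(2**13 - 1))     # True: 8191 really is prime
print(pow(2, 560, 561))           # 1, yet 561 = 3 * 11 * 17 -- a base-2 pseudoprime
[/code]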
 
Last edited:
  • #19
Most people have difficulty understanding the vastness of numbers like a googol. A 200 digit number is a googol squared, not two googols.

Suppose we can check one 100 digit number every second for primality, and say we check every 10th number, or better yet, every 100th number, to use a figure. Then, using 365 days/yr, we would be working for 3.17 x 10^89 YEARS to check every such number for primality!

That's 10^99/(60x60x24x365x100) = 3.17x10^89!
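
(The arithmetic checks out, e.g. in Python:)

[code]
# 10**99 candidates, checking every 100th at one per second, converted to years:
years = 10**99 / (60 * 60 * 24 * 365 * 100)
print(f"{years:.3g}")   # about 3.17e+89
[/code]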
 
Last edited:
  • #20
There are less naive tests for primality, though. The best run in time that is polynomial in the number of digits, not in the number's size.
 
  • #21
Here is a site where one can download a PDF article on primality testing, by Andrew Granville.

http://www.math.gatech.edu/~mbaker/Math4150.html
 
  • #22
Hurkyl said:
There are less naive tests for primality, though. The best run in time that is polynomial in the number of digits, not in the number's size.

I think it's been accepted for a few years now that primality testing is in P.

http://www.cse.iitk.ac.in/news/primality.html

There are two pdfs available from the website above which discuss the method. Note that theorem 5.1 in either the original or revised paper suggests an upper bound of [tex]\log^{12} n [/tex]

Unfortunately I can't understand the number theory :grumpy:


EDIT: You know, on second thought, it seems like you may already be aware of that link/result; in which case, just ignore it. But I don't follow you when you say that the best algorithms run in time polynomial in the number of digits, but not in the number's size. That's what kind of threw me.

I'll leave the "primes in P" link stuff up just in case anyone else is interested.
 
Last edited by a moderator:
  • #23
CrankFan said:
EDIT: You know, on second thought, it seems like you may already be aware of that link/result; in which case, just ignore it. But I don't follow you when you say that the best algorithms run in time polynomial in the number of digits, but not in the number's size. That's what kind of threw me.

I'll leave the "primes in P" link stuff up just in case anyone else is interested.

This is exactly the type of result Hurkyl is referring to. Think of the naive method of testing primality of an integer n: you can do trial division by every integer less than [itex]\sqrt{n}[/itex], so this takes about [itex]\sqrt{n}[/itex] steps. If d is the number of digits of n, this is about [itex]10^{d/2}[/itex] steps. So this algorithm would be considered polynomial time in the size (or absolute value) of n, but exponential in the number of digits. (Note log(n) is proportional to d.) The original AKS algorithm runs in time [itex]d^{12}[/itex], polynomial time in the number of digits of n. (The exponent 12 is down to 6, I believe, due to refinements by Lenstra and Pomerance; see the article by A. Granville for a nice survey.)
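
A literal version of that naive method, just for reference:

[code]
from math import isqrt

def is_prime_trial(n):
    """Trial division by every integer up to sqrt(n):
    about sqrt(n), i.e. roughly 10**(d/2), steps for a d-digit n."""
    if n < 2:
        return False
    for k in range(2, isqrt(n) + 1):
        if n % k == 0:
            return False
    return True

print(is_prime_trial(2**13 - 1))   # True, after about 90 trial divisions
[/code]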
 
  • #24
shmoe said:
This is exactly the type of result Hurkyl is referring to. Think of the naive method of testing primality of an integer n: you can do trial division by every integer less than [itex]\sqrt{n}[/itex], so this takes about [itex]\sqrt{n}[/itex] steps. If d is the number of digits of n, this is about [itex]10^{d/2}[/itex] steps. So this algorithm would be considered polynomial time in the size (or absolute value) of n, but exponential in the number of digits.

Right, but in that case we don't have an algorithm that is polynomial in log n and not polynomial in n.

But I now see what he meant. If I put a "just" between "not" and "in", it all makes sense: "The best run in time that is polynomial in the number of digits, not [just] in the number's size."
 
  • #25
Right now binary factorization is making some huge progress, so I'd stop short of saying there were ANY practical limits to factorization, especially with how commonly computers make huge advances in processing power and memory size.

right now, there are no algorithmic methods of calculating primes, or obviously, we could just generate the largest ones quickly. these numbers obey no known pattern besides a general one of becoming less frequent when distributed against the integer set from 0 on. that comparison is pointless, since the composite numbers don't belong with the primes in a set used for finding JUST PRIMES through an algorithm.

wasn't there a proof that mersenne primes had to have a prime as an exponent? i can't remember where i saw that, but I am pretty sure it's true. this means that only primes are used in generating mersenne primes, but they still produce composite numbers. so, we get closer, but...

it's interesting that when Fermat found his first numbers F0-F4 he assumed, incorrectly, that all Fn would be primes, since the first five were prime.

it's also interesting that in the early 1900s factorization still had to be done by hand. M67 had been shown to be composite by the famous Edouard Lucas, whose tests for compositeness, which don't produce the actual factors, are still used in Prime95 today. The guy who finally factored it, named Frank Cole, did the factoring by hand, reportedly over "three years of Sundays". the factors of M67 are 193,707,721 x 761,838,257,287. that's huge.

T Rex> Mx isn't the xth mersenne prime, x is the exponent. Mx means (2^x)-1
 
  • #26
abertram28 said:
right now, there are no algorithmic methods of calculating primes, or obviously, we could just generate the largest ones quickly.

What do you mean by "calculate primes" here? There are plenty of algorithms that produce primes, maybe not as fast as we'd like, but they are there.

abertram28 said:
wasn't there a proof that mersenne primes had to have a prime as an exponent? i can't remember where i saw that, but I am pretty sure it's true.

Yes, if the exponent is composite, say nm, just factor [tex]2^{nm}-1=(2^m)^n-1=(2^m-1)(2^{m(n-1)}+2^{m(n-2)}+\ldots+2^m+1)[/tex]
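
(A two-line sanity check of that identity:)

[code]
# 2**m - 1 divides 2**(n*m) - 1 whenever the exponent factors as n*m.
for n, m in [(2, 3), (3, 4), (5, 7)]:
    assert (2**(n * m) - 1) % (2**m - 1) == 0
[/code]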
 
  • #27
there are algorithms that plug in single integers and ALWAYS produce primes? could you provide an example? I am having trouble seeing that one being possible. i mean, i believe that one might be possible using a table of primes as input, but that isn't really an algorithm, unless the algorithm also produced those primes by the same process, starting with just 2 and 3. if so, i'd be very interested to play with it! does 1 count as a prime when starting these sequences? is 1 prime? I am dumb.

ah, yes, I've actually used that factorization on the forms of the "odd perfect numbers", should they ever exist. thanks for that one.
 
  • #28
Of course. For example, there is the algorithm that always returns "2". Another example is one that, given a positive integer n, iterates through the integers bigger than n, testing each for primality, and returns the first that passes the test.
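
A sketch of that second algorithm in Python, assuming sympy is available for the primality test:

[code]
from sympy import isprime  # assumption: sympy is installed

def first_prime_after(n):
    """Scan upward from n and return the first integer that tests prime."""
    k = n + 1
    while not isprime(k):
        k += 1
    return k

print(first_prime_after(10**20))   # a 21-digit prime, found almost instantly
[/code]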
 
  • #29
abertram28 said:
there are algorithms that plug in single integers and ALWAYS produce primes?

There are many, see http://mathworld.wolfram.com/PrimeFormulas.html for a few different ones. Some give only a subset of the primes, some give all primes. None are really convenient to work with though.

1 is generally not considered a prime these days.
 
  • #30
ah, i see. none of these are basic algebraic algorithms. none of them are algebraic and use even a restricted input set! the iterative ones aren't really even producing the prime, they are producing forms and testing for primality; it's really a multifunction program rather than a single algorithm. but it still is an algorithm all on its own too. these really digress to the simple algorithm: if p=1, then n is prime; p=1 if n-PHI(n)=1. this is meaningless though, since it's just a simple definition with a number theoretic function.
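
(For what it's worth, that n - PHI(n) = 1 remark is easy to check numerically, assuming sympy's totient is available:)

[code]
from sympy import totient, isprime  # assumption: sympy is installed

# For n >= 2, n - phi(n) == 1 exactly when n is prime,
# since phi(p) = p - 1 only when all of 1..p-1 are coprime to p.
for n in range(2, 200):
    assert (n - totient(n) == 1) == bool(isprime(n))
[/code]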

on 200 digit numbers, my calculator, a TI-89 ti, displays F9 in raw data. it's 155 digits or something. it also factors F6 pretty quickly. i mean, quickly for a handheld battery powered machine!

thanks for the prime formulas, i'll be sure to print some of them and take them on my spring break road trip. it sucks that none of these are really useful. that's what i expected. none of these algorithms are really generators of primes, rather just indicators. i mean, as far as useful ones go. the simple number "2" being an algorithm is useless, though it might fit some arbitrary definition of algorithm.
 
  • #31
you seem to ignore the fact that these procedures do produce lots of large primes rather easily, even if they do not obey the rules you would like them to. the point is there is no problem about finding lots of large primes.

so you seem more interested in the process than the result?
 
  • #32
hey, I am not ignoring the fact that these procedures produce primes.
the fact that you think they are large and that the process is easy is opinion, nothing more. if some of these methods are capable of finding all the primes up to some "large prime" so easily, why do they have to include tests of primality? the algorithm either produces primes or it produces possible primes, and if it doesn't exclusively produce the complete list of primes without testing for primality as part of the computation, it isn't that effective; otherwise the largest prime known wouldn't have taken 50 days to compute, it would have just been generated easily.

i'm not sure how you deem that these algorithmic solutions that test for primality are "easy". I am quite sure they are easy to understand, but easy to compute? hardly.

i am interested in the process, it's the part that makes the procedure "easy" or "hard".
how can you claim the result is something that the process is not? if the procedure is tedious, time consuming, and requires massive computing power, how can the result be "rather easy"? testing primality is pretty refined computing-wise, but it still takes years to test 10 million digit numbers. they aren't just generated, they are produced and then tested by these algorithmic solutions. the test isn't part of the algorithm really, since almost all of the primality tests are the standard test; Lucas set these out over a hundred years ago in basic form.

large is such a relative term, there will always be primes much larger than the ones we are currently testing, and so to say that these numbers are large assumes some initial frame. for me, that frame is when it becomes quicker to look the number up in a table rather than test it. if the number has been tested once, it's done. how do previous tests or any previous piece of information make these tests for primality faster based on a known prime table? as far as i know, the largest prime number known doesn't help in finding the next prime after it, and so the calculations always become increasingly tedious and time consuming. and thus, they cannot be easy when applied to something they need to do.

as long as the algorithm uses a test for primality or produces only known primes, it's useless as far as technology or finding the next prime number goes.

i do, in fact, see a problem with finding lots of 10 million digit primes. it's going to take forever just to test them; producing them is easy, producing them exclusively and inclusively is not. producing all of them without testing all of them is impossible. to me, that means the opposite of everything you just said.

*EDIT* let me add that i think the current methods for solving primality problems are quite elegant and efficient; my point is that there are always larger primes to be plugged and chugged, making the calculations always necessarily difficult and tedious. I am a fan of the current methods. i never really mentioned any specific problem with the algorithms, aside from the fact that there isn't any use over the known basic forms of primes and a primality test. producing a number and testing a number through brute force are not the same thing, in my mind. producing a number and knowing it's prime without testing it was my idea of a "prime producing algorithm" that produces all the primes as it goes up, such that no prime is left out. obviously there is the simple algorithm that just tests each number as you go up by 1. it's not fast or easy. none of the mentioned algorithms are doing anything but eliminating numbers based on them being composite, and then doing the same exact thing as testing every single number. thus, they are not fast or easy either. just my opinion. please, watch your tone. is there an algorithm that doesn't include testing for primality and gives a prime number of increasing size for each increased input?

i don't claim to be all knowing or even to have extensive knowledge on the subject. i do claim to know that definitions of large are relative. even finding the primality of a mersenne, as you said one of the more easily determined primes, takes days and days when you look at unknown primes. none of these algorithms are faster than a table lookup until you get so large that neither a table lookup nor an algorithm is all that quick. still, a table is much much faster. it cannot be said that these algorithmic methods produce large primes, let's use mersenne primes for example, quickly. it took two years with thousands of computers testing to find one, and that was in 1998. though computers are a lot faster today, so is the number of digits in the numbers we are testing. so, it can still be said that the same methods that took tens of thousands of computers years to test enough to find one prime are in use today. another simple fact: you can't generate a mersenne prime of any given size, say with an exponent higher than the highest known prime, without first testing its exponent. this alone should stick out as a problem. without a table of known primes or without searching all these large numbers for primes first, the next larger ones cannot even be generated. how that is fast and easy is beyond me. *EDIT*
 
Last edited:
  • #33
abertram28 said:
thanks for the prime formulas, i'll be sure to print some of them and take them on my spring break road trip. it sucks that none of these are really useful.

Hurkyl gave you an outline for a useful algorithm to generate prime numbers with his second suggestion. One should see that, in principle, it's not too hard to generate "large" prime numbers if one has an efficient way to test for primality.

There is no shortage of papers and books which discuss methods for generating large prime numbers because methods like that are used all the time for cryptography applications.
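
Roughly, those methods boil down to something like this sketch (assuming sympy is available for the probable-prime test; real implementations use cryptographically strong randomness and extra checks):

[code]
import random
from sympy import isprime  # assumption: sympy is installed

def random_prime(digits):
    """Try random odd candidates of the requested size until one tests prime."""
    lo, hi = 10**(digits - 1), 10**digits
    while True:
        candidate = random.randrange(lo | 1, hi, 2)   # a random odd number with `digits` digits
        if isprime(candidate):
            return candidate

print(random_prime(100))   # by the prime number theorem, ~115 tries on average
[/code]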

Maybe you should read up on the subject of generating prime numbers before you tell us, again, that it's impossible.
 
  • #34
exactly my point: if one has an efficient way to test for primality, that means that the algorithm isn't generating primes, but possible primes. simple as that. an equally useful algorithm is: take some a, set n=a-1; if a/n is an integer, a is not prime, else n=n-1 until n=2; if no n divides a, a is prime; then a=a+1. this is nothing more than the basic definition of primality, and is NOT anything even remotely useful for generating primes.

i never said it's impossible to generate primes. i said i didn't think there was an algorithm that generated ONLY primes that INCLUDED all the primes as it went up, and didn't require a primality test. those that require a primality test aren't any more useful than just testing every number.

obviously the algorithms linked are much more complex than simply adding 1 and dividing by every smaller number. I am not that stupid. but none of them are definitely proven to produce every prime and only primes without testing for primality. the primality test was the part that got me. more useful algorithms for GENERATING primes DON'T involve testing some other generated number for primality. if they have to test the number, obviously the generator part of the algorithm ISN'T producing primes, but only candidates.

as to hurkyl's second suggestion, while a lot more useful than the algorithm "2", it still requires testing for primality of a large group of numbers, and that is nothing worth being called fast or easy. in fact, since there is no known pattern, for extremely large numbers it could take an immense amount of time and effort.
 
Last edited:
  • #35
abertram28 said:
exactly my point: if one has an efficient way to test for primality, that means that the algorithm isn't generating primes, but possible primes. simple as that.

But that's false.

abertram28 said:
i never said it's impossible to generate primes.

What you said was:

"right now, there are no algorithmic methods of calculating primes"

but of course there are algorithmic methods for calculating primes.
 
