# What's the best software to calculate very large numbers?

1. Dec 3, 2011

### clearwater304

For instance, the largest known prime number has nearly 13 million decimal digits. Would any normal computer be able to calculate this?

2. Dec 3, 2011

### mathman

I don't know what you mean by a "normal" computer. Obviously it took a computer to get this number.

3. Dec 3, 2011

### phyzguy

I think you mean the Mersenne prime 2^43112609 - 1. Mathematica running on my laptop calculated this number in under 2 minutes, so the answer to your question is yes.
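For context (a sketch of mine, not the poster's Mathematica code): in any arbitrary-precision system, building that number is just a bit shift and a subtraction, which is why it takes so little time. In Python, say:

```python
# Constructing 2^p - 1 is a single bit shift and subtraction; even
# p = 43112609 (about 13 million decimal digits) completes almost instantly.
p = 43112609
m = (1 << p) - 1
print(m.bit_length())    # 43112609

# The same construction at readable scale: the Mersenne prime 2^127 - 1.
print((1 << 127) - 1)    # 170141183460469231731687303715884105727
```

(Printing all 13 million digits is much slower than constructing the number, since decimal conversion is the expensive step.)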

4. Dec 3, 2011

### clearwater304

Awesome. I know from when I took Java a few years ago that a lot of software would only store numbers up to a certain number of digits.

5. Dec 4, 2011

### clearwater304

Is there a way I can run Mathematica remotely on their servers? If I try to do a primality test on a very large number on my notebook, it gives me a recursion error. I suppose this is because it ran out of memory; I only have 4 GB on my notebook.

6. Dec 4, 2011

### Office_Shredder

Staff Emeritus
Most programming languages have limited-size integer types because they reserve a fixed (small) block of memory for each value.

You can use Java's BigInteger class to store numbers of arbitrarily large size (up to your machine's memory limit, I guess).
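To make the contrast concrete, here's a sketch in Python, whose built-in int is arbitrary-precision and so plays the same role as Java's BigInteger; the fixed 64-bit behavior is simulated with a mask:

```python
# What a fixed 64-bit register would hold vs. the exact value.
MASK64 = (1 << 64) - 1

a = 2 ** 63                    # one past the largest signed 64-bit integer
wrapped = (a + a) & MASK64     # fixed-width arithmetic silently wraps to 0
exact = a + a                  # arbitrary-precision arithmetic never overflows

print(wrapped)   # 0
print(exact)     # 18446744073709551616
```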

7. Dec 5, 2011

### Bill Simpson

I don't know anything about these folks, but they seem to offer Mathematica service.

http://www.nimbisservices.com/catalog/cloud-services-mathematica [Broken]

8. Dec 5, 2011

### clearwater304

Mathematica seems to be the way to go; now I just have to create a customized primality test so the proof doesn't take a year.

9. Dec 5, 2011

### phyzguy

What are you trying to do? Calculating a number like this is much, much faster than proving that it is prime. Encryption algorithms rely on the fact that multiplying two large primes to generate a large composite is much (much) faster than extracting these two primes from the resulting composite.
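A toy illustration of that asymmetry (my example; the primes here are deliberately tiny so the naive factorizer below finishes, whereas cryptographic primes run to hundreds of digits):

```python
# Multiplying two primes is essentially instant at any size; recovering
# them from the product by trial division grows like sqrt(n).
p, q = 104723, 104729     # two small primes
n = p * q                 # fast, even if p and q had millions of digits

def smallest_factor(n):
    """Naive trial division: hopeless for large n, shown only for contrast."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n

print(smallest_factor(n))   # 104723, but only because n is tiny
```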

10. Dec 6, 2011

### clearwater304

There's a prize offered by the EFF for finding a prime with 1 billion digits. I want to develop a primality test that makes the computing time more efficient.

11. Dec 7, 2011

### Feryll

I don't think this is what you meant, but it should be reiterated that one never needs to find a factor of a number to prove that it's composite, even for extraordinarily large numbers.
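For the record, the Fermat test is one standard way to do exactly that: by Fermat's little theorem, any prime n satisfies 2^(n-1) ≡ 1 (mod n), so a violation proves n composite without exhibiting any factor. A sketch:

```python
def fermat_witness(n, a=2):
    """Return True when base a proves n composite: a prime n must
    satisfy a^(n-1) == 1 (mod n) by Fermat's little theorem."""
    return pow(a, n - 1, n) != 1

print(fermat_witness(221))         # True: 221 = 13 * 17 is proved composite,
                                   # yet no factor was ever computed
print(fermat_witness(2**127 - 1))  # False: the Mersenne prime passes
```

The converse fails occasionally: rare base-2 pseudoprimes such as 341 = 11 × 31 also pass, which is why practical tests try several bases (as in Miller–Rabin).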

12. Dec 8, 2011

### TylerH

Solving 10^13000000 = 2^x gives x ≈ 43 million, which means it takes about 43 million bits, or roughly 5.4 million bytes, to represent the number. Since most computers have on the order of billions of bytes (gigabytes) of memory, storage isn't the problem. The problem is the computation.
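Redoing that back-of-envelope estimate (a d-digit decimal number needs about d × log2(10) bits):

```python
import math

digits = 13_000_000
bits = math.ceil(digits * math.log2(10))   # about 43.2 million bits
print(bits, bits // 8)                     # roughly 5.4 million bytes
```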

Good luck with that. You might want to look at some of the existing ones first. The AKS primality test is the best-known primality test.
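Worth adding: for Mersenne numbers 2^p - 1 specifically, the test used in practice (e.g. by GIMPS) is the Lucas–Lehmer test, which is dramatically faster than general-purpose tests. A minimal sketch:

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test for odd prime p: 2^p - 1 is prime iff s == 0
    after p - 2 iterations of s -> s^2 - 2 (mod 2^p - 1), starting from s = 4."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Recovers the Mersenne prime exponents among these candidates:
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23, 31, 61, 89) if lucas_lehmer(p)])
# [3, 5, 7, 13, 17, 19, 31, 61, 89]
```

Each iteration is one big-number squaring, so the whole test costs p - 2 squarings; GIMPS does those squarings with FFT-based multiplication.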

You do if you want your algorithm to be anywhere near efficient. EDIT: Never mind, I was wrong. I was presuming you were referring to the Wilson test, but I just remembered the quadratic residue test, which is a very efficient way to eliminate a lot of composite numbers without finding a factor.

GMP, a C and C++ library, can handle arbitrarily large numbers and is very heavily optimized. So can Java's BigInteger, but I don't know how optimized it is.
