# Other "primes"

by -Job-
Tags: primes
 Sci Advisor P: 1,132

If, instead of defining a prime as an integer divisible only by 1 or itself, we define it as an integer not divisible by any number other than 1, 2, half of itself, or itself (numbers not divisible by 2 still being "able" to be primes), then we generate a new "prime" sequence: 1, 2, 3, 4, 5, 6, 7, 8, 10, 11, 13, 14, 17, 19, 22, 23, 26, 27, ... This way we allow even numbers to be "prime". We could also define a prime as a number not divisible by any number other than 1, 2, 3, a third of itself, half of itself, or itself. As we do so we are clearly increasing the number of "primes".

The question then is: are any of these sequences easier to predict than the traditional primes? If we let $$primes_{1}$$ be the traditional primes, then:

$$primes_{1} = 1, 3, 5, 7, 11, 13, 17, 19, 23, 27 ...$$
$$primes_{2} = 1, 2, 3, 4, 5, 6, 7, 8, 10, 11, 13, 14, 17, 19, 22, 23, 26, 27 ...$$
$$primes_{3} = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 17, 19, 21, 22, 23, 26, 27 ...$$
$$primes_{4} = ...$$

Do you think these sequences behave similarly after a while? If they're all as "unpredictable", that means there are infinitely many sequences that are as unpredictable as the primes.
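Job's rule can be restated: n belongs to the k-th sequence iff every divisor d of n satisfies d ≤ k or n/d ≤ k. A quick Python sketch for experimenting with these sequences (the function name is mine, not from the thread):

```python
# Sketch of the generalized "primes" described above: keep n iff every
# divisor d of n satisfies d <= k or n/d <= k.  With k = 1 this gives
# the usual primes (together with 1).  Name is illustrative only.
def generalized_primes(k, limit):
    out = []
    for n in range(1, limit + 1):
        if all(d <= k or n // d <= k
               for d in range(1, n + 1) if n % d == 0):
            out.append(n)
    return out
```

Note that, read literally, the definition puts 2 in the k = 1 sequence and excludes 27 from the k = 2 sequence (27 is divisible by 9), two points that come up in the replies.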
PF Patron
Emeritus
P: 16,094
 If, instead of defining a prime as an integer divisible only by 1 or itself
Technically, that's irreducible. For p to be prime, it has to satisfy:

Whenever p | ab, we have p | a or p | b.

(or both, of course)

The traditional primes, of course, include 2 and do not include 1.

Your set "primes2" is simply all numbers of the form:
2^m, p, 2p
where m < 4, and p is a prime.

Your set "primes3" is simply all numbers of the form:
2^m 3^n, p, 2p, 3p
where m+n < 4, and p is a prime.

Yes, you missed "18" in your list "primes3".
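These characterizations can be checked directly against the divisor-based definition; a small sketch for the "primes2" case (helper names are mine):

```python
# Check: the divisor rule "every divisor d of n has d <= 2 or n/d <= 2"
# yields exactly {2^m : m < 4} union {p, 2p : p prime}.
# Note 2^0 = 1, so the form-based set also contains 1.
def in_primes2(n):
    return all(d <= 2 or n // d <= 2
               for d in range(1, n + 1) if n % d == 0)

def is_prime(n):
    return n >= 2 and all(n % d for d in range(2, int(n**0.5) + 1))

def primes2_by_form(limit):
    s = {2**m for m in range(4) if 2**m <= limit}
    for p in range(2, limit + 1):
        if is_prime(p):
            s.add(p)
            if 2 * p <= limit:
                s.add(2 * p)
    return sorted(s)
```

The two descriptions agree on every n up to any limit you care to test.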

 that means there are infinitely many sequences that are as unpredictable as the primes.
We already knew that. Heck, as far as sequences go, the sequence of primes is extraordinarily simple. We can:
(1) Write down a list of properties that completely characterizes the primes.
(2) Write down an algorithm that, for any n, computes the n-th prime.
(3) Write down a formula for the n-th prime.

In some technical sense, (1) is extraordinarily rare, and I think (2) and (3) are even more rare.
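Point (2) is easy to make concrete; here is a naive sketch (certainly not the fastest method, just a demonstration that such an algorithm exists):

```python
# Naive algorithm for the n-th prime (1-indexed): count primes found
# by trial division up to the square root of each candidate.
def nth_prime(n):
    count, candidate = 0, 1
    while count < n:
        candidate += 1
        if all(candidate % d for d in range(2, int(candidate**0.5) + 1)):
            count += 1
    return candidate
```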
 Sci Advisor P: 1,132

From an algorithmic perspective, identifying primes, or the n-th prime, isn't that simple, since compositeness is in NP. We can't determine whether a number is composite or prime in polynomial time (as far as I know); to have an answer with 100% certainty you would need exponential time.

I was wondering whether adding numbers to the list of primes, as in primes(2) and primes(3), or any other sequence resulting from adding numbers to the primes, would produce a sequence whose elements we can identify in deterministic polynomial time, and at what point that change occurs. Such a sequence would say something about the certainty we can achieve in polynomial time about whether a number is or isn't a prime.

There are other problems in NP that seem to suffer the same problem of not admitting 100% certainty in polynomial time, and which I have reduced to sequences of numbers similar, but not equal, to the primes.
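On the question of certainty in polynomial time: the classical compromise is a probabilistic test that runs in time polynomial in the number of bits. A sketch of the Fermat test, as an illustration of the trade-off (not from the thread; Miller-Rabin is the stronger standard choice):

```python
# Fermat test: polynomial time (three-argument pow is fast modular
# exponentiation), but only probabilistic.  A prime always passes;
# some composites (e.g. Carmichael numbers) can pass too, so a
# "yes" answer is not 100% certain.
def fermat_probably_prime(n, bases=(2, 3, 5, 7)):
    if n < 2:
        return False
    for a in bases:
        if a % n == 0:
            continue  # skip bases that are multiples of n
        if pow(a, n - 1, n) != 1:
            return False  # definitely composite
    return True  # probably prime
```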
P: 1,373


umm is prime factoring really in NP??? I was under the impression
for (int i=0; i<=n/2; i++) isfactor(n,i)
is a polynomial-time algorithm? The problem is that it is computationally inefficient to do this for very large n, but the algorithm is still polynomial, isn't it?
 Sci Advisor P: 1,132

No, you're not seeing it correctly. Numbers are represented in binary, so your n is represented in log(n) bits; hence the input size is actually log(n). Letting m be the input size, you actually require (2^m)/2 operations.

I did see an article about an algorithm for identifying primes in polynomial time, but either it is not correct or not yet accepted. In theory such an algorithm could exist, because compositeness isn't NP-complete, so it wouldn't say anything about whether P = NP, but I don't know that it does exist, and it is certainly not trivial to come up with one.
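The input-size argument in concrete numbers (a small sketch):

```python
# An integer n occupies about log2(n) bits.  With input size
# m = bit_length(n), the naive loop up to n/2 performs roughly
# 2^(m-1) iterations: exponential in m, not polynomial.
def loop_iterations_vs_input_size(n):
    m = n.bit_length()   # input size in bits
    return m, n // 2     # (bits, iterations of the naive loop)
```

For n = 10^6 the input is only 20 bits, yet the loop runs half a million times.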
 Sci Advisor P: 1,132

Actually, it is in P after all: http://en.wikipedia.org/wiki/AKS_primality_test

Kind of unexpected for me, because in dealing with NP-complete problems I'd swear I was dealing with something very close to the primes. This means that a route to showing that P = NP (unlikely) would be to reduce an NP-complete problem to the problem of primality testing. It seems like I'm always wrong no matter what, though. :P One of these days I'll run out of wrong things to say. :P
HW Helper
P: 2,588
 Quote by neurocomp2003 umm is prime factoring really in NP??? I was under the impression for (int i=0; i<=n/2; i++) isfactor(n,i) is a polynomial-time algorithm? The problem is that it is computationally inefficient to do this for very large n, but the algorithm is still polynomial, isn't it?
Does 'isfactor' take polynomial time?
 Sci Advisor P: 1,132

I'm guessing isFactor(n, i) would be equivalent to:

n mod i == 0

which is linear or quadratic, I'm not sure which; it depends on how multiplication/division is implemented on the given architecture. I think it's linear, and it's definitely polynomial. The problem is that looping from 0 to n/2 actually takes exponential time. Neurocomp's algorithm isn't polynomial, but whether he knew it or not, he is correct about primality being in P.
 HW Helper Sci Advisor P: 2,588

Well, primality testing and prime factoring are two different tasks. His algorithm for primality testing looks like it would take polynomial time to me. You're saying that isfactor is linear, and it is run n/2 times, so we'd get something on the order of n²/2 as the running time, wouldn't we?

I don't understand this:

 Quote by -Job- No, you're not seeing it correctly. Numbers are represented in binary. So your n is represented in log(n) bits, hence the input size is actually log(n). Letting m be the input size, then you actually require (2^m)/2 operations.
 Sci Advisor P: 1,132

Neurocomp's algorithm is the following:

function isPrime(n){
    for(i = 2; i <= n/2; i++){
        if(n % i == 0) return false;
    }
    return true;
}

which loops on the order of n/2 times, i.e. exponentially many times in the number of bits of n.
 P: 534 Primality testing is in P, just look at the sieve of Eratosthenes. Super easy.
PF Patron
Emeritus
P: 16,094
 Quote by ComputerGeek Primality testing is in P, just look at the sieve of Eratosthenes. Super easy.
How do you figure?

For testing a k-bit number, you have to sieve over an array of size $2^k$. How do you plan on doing that in polynomial time?

Job: it's interesting to note that factoring has a known L(1/3, 1+e) algorithm (the number field sieve). Isn't that better than the best we know for NP-complete problems? More food for thought.
P: 534
 Quote by Hurkyl How do you figure? For testing a k-bit number, you have to sieve over an array of size $2^k$. How do you plan on doing that in polynomial time?
Hmm.....

You're right, I guess I should let my Number Theory prof know he was wrong :-) (I like it when profs are wrong, it reminds me that they are not infallible... no matter how some of them like to pretend they are.)

I did a search after I read your post and found the Dynamic Wheel Sieve, which is O(log log n).
 P: 1,373 my bad... by "prime factoring" I meant primality testing... reasons why I need to sleep more
HW Helper
P: 1,996
 Quote by ComputerGeek You're right, I guess I should let my Number Theory prof know he was wrong :-) (I like it when profs are wrong, it reminds me that they are not infallible... no matter how some of them like to pretend they are.)
or you misunderstood him.

 Quote by ComputerGeek I did a search after I read your post and found the Dynamic Wheel Sieve which is O(log log n).
What exactly are you claiming can be done in O(log log n) time? Certainly NOT testing the primality of an integer n: that would give primality testing in time logarithmic in the number of bits of n, much better than polynomial time.

Pritchard's wheel sieve gave something like O(N/log log N) time (measured in additions) to produce a list of the primes less than N. That may certainly have been improved upon (not to O(log log N), of course, or anything close). What sieve are you referring to?

edit: I googled "Dynamic Wheel Sieve" and one of the first links mentions Pritchard's with a running time of "O(n= log log n)", which may be what ComputerGeek found: a typo or ASCII translation error.
 Sci Advisor P: 1,132

It's important to get our variables straight. If sieve(N) is a sieve that builds a list of all primes up to N, then let's define n such that n = log(N). So N is the input number, and n is the number of bits needed to represent N.

The runtime of the sieve of Eratosthenes is O(N log log N), which is equivalent to O(2^n log n). From what I saw online, Pritchard's dynamic wheel sieve requires only O(N/log log N), which is equivalent to O(2^n/log n). The AKS primality test, on the other hand, runs in O((log N)^12), which is equivalent to O(n^12); apparently it can very likely be reduced to O(n^6).

Of all these algorithms only AKS is polynomial, but AKS isn't a sieve: it is used to determine whether a single number is prime or not.
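For reference, here is a standard sketch of the sieve under discussion; its cost is roughly N log log N operations, i.e. exponential in the bit length n = log N:

```python
# Sieve of Eratosthenes: returns all primes <= N.  About N log log N
# operations in total, which is exponential in the bit length of N.
def sieve(N):
    flags = [True] * (N + 1)
    flags[0:2] = [False, False]           # 0 and 1 are not prime
    for p in range(2, int(N**0.5) + 1):
        if flags[p]:
            # cross off multiples of p, starting at p*p
            for multiple in range(p * p, N + 1, p):
                flags[multiple] = False
    return [i for i, prime in enumerate(flags) if prime]
```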
P: 534
 Quote by shmoe or you misunderstood him.
nope.

 What exactly are you claiming can be done in O(log log n) time? Certainly NOT testing the primality of an integer n: that would give primality testing in time logarithmic in the number of bits of n, much better than polynomial time. Pritchard's wheel sieve gave something like O(N/log log N) time (measured in additions) to produce a list of the primes less than N. That may certainly have been improved upon (not to O(log log N), of course, or anything close). What sieve are you referring to? edit: I googled "Dynamic Wheel Sieve" and one of the first links mentions Pritchard's with a running time of "O(n= log log n)", which may be what ComputerGeek found: a typo or ASCII translation error.
You're right, I did use that first link. If it is a typo, then that is what it is.