- #1
Newtime
Note: I have never formally studied number theory so what I'm about to ask may be either completely trivial or completely meaningless. In either case, I don't know what to search for to find my answer since I don't know the terminology.
Is there a measure of how far from being prime a given natural number is? For example, I was thinking a good measure might be, given the prime decomposition, how many distinct prime factors there are. So we might say [tex]100=2^2 \cdot 5^2 [/tex] is closer to being prime than, say, [tex] 150=5^2 \cdot 3 \cdot 2 [/tex]. However, the first problem I see here is that a number with a few hundred digits but only 3 distinct prime factors would be considered just as un-prime as the number [tex]30=2\cdot 3 \cdot 5 [/tex]. The reason I think this may be a problem is that, as a number grows in size (number of digits), one would expect it to be more likely (in some sense) to have many prime factors in its decomposition, whereas 30 stands no chance of having a large number of prime factors in its decomposition. So perhaps with this method one would need to normalize things and define some sort of index.
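To make the measure I have in mind concrete, here is a minimal sketch (Python, trial division; the helper name is just something I made up) that counts distinct prime factors for the examples above:

[code]
def distinct_prime_factors(n):
    """Count the distinct primes in the factorization of n (simple trial division)."""
    count = 0
    p = 2
    while p * p <= n:
        if n % p == 0:
            count += 1
            # strip out every copy of p so it is only counted once
            while n % p == 0:
                n //= p
        p += 1
    if n > 1:  # whatever remains is itself a prime factor
        count += 1
    return count

# Under this measure, fewer distinct factors = "closer to prime":
print(distinct_prime_factors(100))  # 100 = 2^2 * 5^2      -> 2
print(distinct_prime_factors(150))  # 150 = 2 * 3 * 5^2    -> 3
print(distinct_prime_factors(30))   # 30  = 2 * 3 * 5      -> 3
[/code]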
Has all this been done? If so, can someone direct me to an expository article on the subject? If not, I would assume it's because this leads nowhere, since it seems a natural question to ask. If it leads nowhere, why?