Evolving Prime Number Algorithms: Can Computers Duplicate Human Programming?

In summary, there are several algorithms for testing whether a number is prime, the simplest being trial division by every number less than the candidate. Another approach divides only by numbers less than or equal to half of the candidate. It is more efficient still to check possible divisors only up to the square root of the number. This progression evolved in the mind of the programmer, but can it be duplicated by a computer given only the program's objective? The idea involves aspects of genetic programming and metaprogramming, and there has been some research in this area; relevant fields to explore include meta-learning, meta-genetic programming, and machine learning in general.
  • #1
shivakumar06
There are many prime number algorithms, the simplest being division by all the numbers less than the candidate. Then comes division by the numbers less than or equal to half of the candidate, and finally division by all the primes smaller than the given number. This is just the way the program has evolved in the mind of the programmer. Can this evolution of programming be duplicated by a computer, if only the objective of the program is given (maybe in Haskell; I am not sure)? If yes, in which programming language, and what keywords should I search for? A sketch of the three stages appears below.
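To make the three stages concrete, here is a minimal sketch in Haskell (the language mentioned above); the names isPrime1, isPrime2, and isPrime3 are made up for this illustration, not taken from any library:

-- Stage 1: divide by every number below n.
isPrime1 :: Integer -> Bool
isPrime1 n = n > 1 && all (\k -> n `mod` k /= 0) [2 .. n - 1]

-- Stage 2: no proper divisor of n can exceed n/2, so stop there.
isPrime2 :: Integer -> Bool
isPrime2 n = n > 1 && all (\k -> n `mod` k /= 0) [2 .. n `div` 2]

-- Stage 3: any composite divisor implies a smaller prime divisor,
-- so it is enough to divide by the primes below n.
isPrime3 :: Integer -> Bool
isPrime3 n = n > 1 && all (\p -> n `mod` p /= 0) (filter isPrime2 [2 .. n - 1])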
 
  • #2
There are many prime number algorithms, the simplest being division by all the numbers less than the candidate. Then comes division by the numbers less than or equal to half of the candidate...

Actually, you only need to check possible divisors up to the square root of the number you're testing: if n = a*b with a <= b, then a*a <= n, so any composite number has a divisor no larger than its square root. Going up to half of the number is overkill.
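A corresponding Haskell sketch, using k * k <= n to avoid a floating-point square root (the name isPrime is again just illustrative):

-- If n = a*b with 2 <= a <= b, then a*a <= n, so checking divisors
-- up to sqrt n is enough.
isPrime :: Integer -> Bool
isPrime n = n > 1 && all (\k -> n `mod` k /= 0) (takeWhile (\k -> k * k <= n) [2 ..])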

This is just the way the program has evolved in the mind of the programmer. Can this evolution of programming be duplicated by a computer, if only the objective of the program is given (maybe in Haskell; I am not sure)? If yes, in which programming language, and what keywords should I search for?

What you're describing sounds like a cross between genetic programming and metaprogramming. There has been some work in this area, such as meta-learning for machine learning algorithms, meta-genetic programming, and, to a certain extent, machine learning in general. I recommend starting with those fields and seeing where they take you. A toy illustration of the underlying search idea follows.
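To give a flavour of evolutionary search, here is a toy sketch in Haskell. This is a (1+1) mutate-and-select loop, a drastic simplification of real genetic programming (which evolves program trees with crossover), and every name in it (Bound, isPrimeWith, fitness, evolve) is invented for this example. The candidate "programs" are just three divisor bounds for trial division; fitness demands correctness on a test set first and low cost second, and selection should converge on the square-root bound:

-- Toy evolutionary search over trial-division "programs" (illustrative only).
data Bound = Full | Half | SqrtB deriving (Show, Eq, Enum, Bounded)

limit :: Bound -> Integer -> Integer
limit Full  n = n - 1
limit Half  n = n `div` 2
limit SqrtB n = floor (sqrt (fromIntegral n :: Double))

isPrimeWith :: Bound -> Integer -> Bool
isPrimeWith b n = n > 1 && all (\k -> n `mod` k /= 0) [2 .. limit b n]

testSet :: [Integer]
testSet = [2 .. 200]

-- A candidate is correct if it agrees with the exhaustive version everywhere.
correct :: Bound -> Bool
correct b = all (\n -> isPrimeWith b n == isPrimeWith Full n) testSet

-- Cost: roughly how many divisions the candidate performs on the test set.
cost :: Bound -> Integer
cost b = sum [max 0 (limit b n - 1) | n <- testSet]

-- Lower fitness is better; wrong answers cost far more than extra work.
fitness :: Bound -> Integer
fitness b = (if correct b then 0 else 1000000000) + cost b

-- Tiny deterministic pseudo-random stream, so no extra packages are needed.
rands :: [Int]
rands = tail (iterate (\x -> (1103515245 * x + 12345) `mod` 2147483648) 42)

-- (1+1) selection: mutate, keep the mutant only if it is fitter.
evolve :: [Int] -> Int -> Bound -> Bound
evolve _      0 best = best
evolve (r:rs) k best =
  let cand  = toEnum (r `mod` 3)
      best' = if fitness cand < fitness best then cand else best
  in evolve rs (k - 1) best'
evolve []     _ best = best

main :: IO ()
main = print (evolve rands 50 Full)   -- should print SqrtB: the sqrt bound survives

Real genetic programming replaces the three fixed candidates with randomly generated expression trees and adds crossover, but the selection pressure works the same way: correctness is non-negotiable, and cheaper programs win.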
 

1. What are prime numbers and why are they important in computing?

Prime numbers are positive integers greater than 1 that are divisible only by 1 and themselves. They are important in computing because every integer greater than 1 factors uniquely into primes, making them the building blocks of the integers, and many mathematical algorithms and encryption methods rely on prime numbers for their complexity and security.
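As a concrete illustration of the "building blocks" point, here is a small Haskell sketch (the name factorize is made up for this example):

-- Every integer n > 1 factors uniquely into primes, e.g. 360 = 2*2*2*3*3*5.
factorize :: Integer -> [Integer]
factorize n = go n 2
  where
    go 1 _ = []
    go m k
      | k * k > m      = [m]                 -- remaining m is itself prime
      | m `mod` k == 0 = k : go (m `div` k) k
      | otherwise      = go m (k + 1)

For example, factorize 360 evaluates to [2,2,2,3,3,5].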

2. What is the difference between a human-programmed prime number algorithm and a computer-generated one?

A human-programmed prime number algorithm is created by a person using their own logic and understanding of mathematics, while a computer-generated one is created by a machine following predefined rules and instructions. Human-programmed algorithms may be more creative and adaptable, while computer-generated ones may be more efficient and precise.

3. Can computers really duplicate human programming when it comes to prime number algorithms?

In theory, yes, computers can duplicate human programming when it comes to prime number algorithms. However, it is unlikely that a computer could replicate the exact same thought process and decision making as a human programmer, so the resulting algorithm may differ in some ways.

4. How do prime number algorithms evolve over time?

Prime number algorithms can evolve over time through a process called optimization. This involves testing and tweaking the algorithm to improve its efficiency and accuracy. As technology advances, new algorithms may also be developed to take advantage of new computing capabilities.
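A typical example of such a tweak, sketched in Haskell (the name isPrime6k is illustrative): after checking divisibility by 2 and 3, every remaining prime divisor has the form 6k - 1 or 6k + 1, so trial division can skip two thirds of the candidates.

-- Trial division refined with the classic 6k +/- 1 stride.
isPrime6k :: Integer -> Bool
isPrime6k n
  | n < 2          = False
  | n < 4          = True              -- 2 and 3 are prime
  | n `mod` 2 == 0 = False
  | n `mod` 3 == 0 = False
  | otherwise      = go 5              -- test 5, 7, 11, 13, 17, 19, ...
  where
    go k
      | k * k > n            = True
      | n `mod` k == 0       = False
      | n `mod` (k + 2) == 0 = False
      | otherwise            = go (k + 6)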

5. What are the potential implications of computers being able to duplicate human programming in prime number algorithms?

If computers are able to duplicate human programming in prime number algorithms, it could lead to faster and more efficient methods for solving complex mathematical problems and for securing sensitive data. However, it could also raise ethical concerns about the role of human creativity and decision making in the field of computing.
