Will P Ever Equal NP as Technology Advances?

Thread starter: faddishworm
Hey All,

I was wondering if anyone has ever done any work on the P vs NP problem that involved expressing P = NP as some kind of limit.

I am sure there are 1001 "proofs" submitted every day to the official boards at math institutes, so I thought I would come here to get a handle on the current state of the P vs NP problem.

My reasoning tells me that at some point in time P will equal NP, because our ability to solve problems right now is limited. If you imagine thousands or millions of years from now, and we are still around, then given the rate at which we solve problems and invent technology, surely we would be able to solve anything as quickly as we can verify it today. Bear with me, I know this sounds like hocus pocus.

RSA Laboratories used to pay out prize money to members of the public who could factorize huge numbers: http://en.wikipedia.org/wiki/RSA_Factoring_Challenge

They actually had to make a few payouts, because by harnessing the CPUs of thousands of machines, teams could compute the factors in about six months, even though the problem was supposed to be computationally infeasible.

Given the rate at which CPUs get more powerful, and given that they haven't been around very long, is it feasible to say that at some point in time we will be able to factorize huge numbers as quickly as we can verify them?

Is it fair to say that at some point in time P will equal NP? Is there any school of thought regarding this?
 
P vs NP refers to whether, for every problem whose solutions can be verified quickly, there also exists an algorithm whose running time grows at only a polynomial rate. It has nothing to do with CPU power. If P = NP a million years from now, then P = NP today, because the question is a purely mathematical one.
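For reference, here is a compact (slightly informal) way to state it: ##\mathrm{P}## is the class of decision problems that can be solved in a number of steps polynomial in the input length, and ##\mathrm{NP}## is the class whose "yes" answers can be verified in polynomially many steps given a suitable certificate. It is known that ##\mathrm{P} \subseteq \mathrm{NP}##; the open question is
$$\mathrm{P} \overset{?}{=} \mathrm{NP}.$$
Both classes are defined purely by how the required number of steps scales with the input size, so nothing about them changes when hardware improves.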
 
How can an algorithm grow?
 
See the Wikipedia article on Analysis of Algorithms, specifically the section on order of growth.

Another thing to point out: we use the concept of time complexity. How much "time" an algorithm uses is measured by the number of steps it performs, not the number of seconds it takes. So even if CPUs get faster, that has no bearing on the order of growth, as the sketch below illustrates.
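As a toy illustration (my own sketch, not anything from the thread), here is how one might count steps rather than seconds for trial-division factoring; the function names are made up for the example:

```python
# Toy sketch: in complexity theory, "time" means step count, not seconds.
# For an n-bit prime, trial division performs about 2^(n/2) loop iterations,
# and that count is identical on any CPU -- only the seconds per step change.

def is_prime(n):
    """Trial-division primality test (fine for small inputs)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return n >= 2

def trial_division_steps(n):
    """Count how many candidate divisors trial division tries on n."""
    steps, d = 0, 2
    while d * d <= n:
        steps += 1
        if n % d == 0:
            break
        d += 1
    return steps

for bits in (16, 20, 24, 28):
    # worst case for trial division: a prime with `bits` bits
    p = (1 << bits) - 1
    while not is_prime(p):
        p -= 1
    print(f"{bits}-bit prime {p}: {trial_division_steps(p)} steps")

# Each extra 4 bits roughly quadruples the step count, i.e. it grows like
# 2^(bits/2).  A CPU that is 1000x faster cuts the wall-clock time by a
# constant factor but leaves these counts -- and the order of growth -- alone.
```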
 
Do you mean that the time it takes an algorithm to compute something grows, or the algorithm itself?
 
If n is the length of the string used as input, we estimate the number of elementary operations needed to solve the problem with the algorithm. Then we get an (asymptotic) function of n, and what matters is how fast that function grows as n approaches infinity.
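To connect this back to the factoring example in the original post, here is a rough worked comparison using naive schoolbook bounds (ignoring the faster algorithms that exist on both sides): for an ##n##-bit number ##N##, verifying a claimed factorization ##p \cdot q = N## is a single multiplication costing about
$$O(n^2) \text{ bit operations},$$
while finding a factor by trial division may have to try up to about
$$2^{n/2} \text{ candidate divisors}.$$
The first step count is polynomial in ##n##, the second exponential. Whether that kind of gap between verifying and solving is unavoidable in principle is exactly the sort of question P vs NP asks, and hardware speed never enters into it.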
 