What is the lowest known complexity for multiplication?

CRGreathouse
'Schoolbook' multiplication takes ##O(n^2)## operations. Karatsuba multiplication takes ##O(n^{\lg 3}) \approx O(n^{1.59})## operations. The best method I know of (which is only practical for very large numbers) is ##O(n \log n \log\log n)##.
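For concreteness, here is a rough Python sketch of the Karatsuba idea (the function name, the base-case threshold, and the midpoint split are just illustrative choices on my part, not from any particular reference). Replacing the four half-size products of the schoolbook method with three gives the recurrence ##T(n) = 3T(n/2) + O(n)##, which solves to ##O(n^{\lg 3})##:

```python
# Rough sketch of Karatsuba multiplication for nonnegative integers.
# The 2**32 threshold and the midpoint split are arbitrary illustrative choices.

def karatsuba(x, y):
    """Multiply nonnegative integers x and y using three half-size products."""
    # Base case: small enough to hand off to ordinary multiplication.
    if x < 2**32 or y < 2**32:
        return x * y

    # Split both operands at m bits: x = x_hi*2^m + x_lo, y = y_hi*2^m + y_lo.
    m = max(x.bit_length(), y.bit_length()) // 2
    x_hi, x_lo = x >> m, x & ((1 << m) - 1)
    y_hi, y_lo = y >> m, y & ((1 << m) - 1)

    # Three recursive products instead of the schoolbook four.
    a = karatsuba(x_hi, y_hi)                        # high * high
    b = karatsuba(x_lo, y_lo)                        # low * low
    c = karatsuba(x_hi + x_lo, y_hi + y_lo) - a - b  # cross terms

    # Recombine: x*y = a*2^(2m) + c*2^m + b.
    return (a << (2 * m)) + (c << m) + b


if __name__ == "__main__":
    import random
    x, y = random.getrandbits(10_000), random.getrandbits(10_000)
    assert karatsuba(x, y) == x * y  # sanity check against built-in multiplication
```

The final assert just checks the sketch against Python's built-in multiplication; a real implementation would tune the cutoff and fall back to hardware multiplication much earlier. The point is only the three-recursive-call structure behind the ##n^{\lg 3}## count.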

Is there a known nontrivial lower bound on the complexity of multiplication? I read a paper that discussed the time-area (silicon) tradeoff for multiplication, but I wasn't able to extract a useful time-only bound from it (since I care only about the time and don't know how to 'remove' the area factor):
The Area-Time Complexity of Binary Multiplication

Can anyone help me out here?
 
Well, Donald Knuth's The Art of Computer Programming, Volume 2: Seminumerical Algorithms has a section entitled "How Fast Can We Multiply?", and if I remember correctly he comments there on the lack of a lower bound for multiplication algorithms. However, that was written over 35 years ago, so you'd think there'd have been some progress since then. I tried searching for papers that reference that chapter of Knuth's book, but didn't find very many.
 
Yeah, I've been searching but haven't found anything. I hoped the new Algorithms by Dasgupta, Papadimitriou, and Vazirani might have something, but no such luck. I don't have a copy of Knuth's book; otherwise that's where I would have started (old as it is).

I did find something online, but I don't think I trust it: "Surprisingly, there are several better [multiplication] algorithms, taking as few as ##n\log n## operations (although whether it's possible to solve this in ##O(n)## operations is a major open problem)."
http://www.cs.princeton.edu/courses/archive/spr06/cos522/lec1.pdf
What do you think? On one hand it supports what you remember reading; on the other, it's written for a basic audience and seems to misquote the complexity of (what I presume to be) Schönhage–Strassen (S-S) multiplication.
 
No, I don't believe the ##n \log n## claim either. I did find http://www.csc.liv.ac.uk/~ped/teachadmin/algor/complex.html which reports it as a conjecture, but even then I'm doubtful: I would have thought that an ##n \log n## algorithm would be straightforward enough to have been found by now. It looks like S-S is still the best known. The lack of a good lower bound isn't particularly surprising; proving such things is never easy.
 