Complexity of divide and conquer algorithm?

SUMMARY

The discussion centers on the complexity of divide and conquer algorithms, specifically addressing the misconception that partitioning always reduces computational time. It is established that partitioning a problem, such as matrix multiplication, does not necessarily yield a performance benefit and can introduce additional overhead. The example of multiplying square matrices illustrates that while the conventional method runs in O(n^3) time, partitioning does not improve this complexity due to the costs associated with reconstruction and bookkeeping. The conclusion emphasizes that divide and conquer strategies may not always be advantageous and can sometimes increase overall computational time.

PREREQUISITES
  • Understanding of algorithm complexity notation (e.g., O(n^k))
  • Familiarity with divide and conquer algorithm principles
  • Knowledge of matrix multiplication algorithms
  • Basic concepts of recursion in algorithms
NEXT STEPS
  • Research the implications of partitioning in divide and conquer algorithms
  • Study advanced matrix multiplication techniques, such as Strassen's algorithm
  • Explore the trade-offs of recursion versus iteration in algorithm design
  • Learn about the merge sort algorithm and its efficiency compared to other sorting methods
USEFUL FOR

Computer scientists, software engineers, and algorithm designers looking to deepen their understanding of algorithm efficiency and the practical implications of divide and conquer strategies.

homomorphism
Let's say I have some algorithm with complexity O(n^k) for some constant k, and let's say it runs in some time T. Now I want to implement a divide and conquer approach for this algorithm by dividing the problem in half at each recursion. So the first time I run my algorithm it runs in time T. At the next level it is O(\frac{n^k}{2^k}) + O(\frac{n^k}{2^k}) [since I'm running my algorithm on both halves], so the time will be \frac{T}{2^k} + \frac{T}{2^k}. At the level after that there are four subproblems, each O(\frac{n^k}{4^k}), so the time will be 4 \cdot \frac{T}{4^k}.

So in the worst case, with infinitely many levels of recursion, I should get some sum

\sum_{i=0}^{\infty} \frac{T}{2^{i(k-1)}}

where T is a constant. Does this look about right?
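For reference: if that per-level cost model held (it ignores the partitioning and reconstruction costs discussed below), the sum is a geometric series, and for k > 1 it converges:

\sum_{i=0}^{\infty} \frac{T}{2^{i(k-1)}} = T \sum_{i=0}^{\infty} \left(2^{1-k}\right)^i = \frac{T}{1 - 2^{1-k}}

so the total work over all levels would stay within a constant factor of T.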
 
bump...
 
homomorphism said:
Does this look about right?

You are doing several things wrong here. You are assuming that the problem at hand can be partitioned. This is not always the case. You are assuming that partitioning buys something, and this also is not always the case. Even if some problem can be partitioned and the partitioned problem is easier to solve, there is typically a cost involved in reconstructing the solution in terms of the original problem. Finally, you are assuming infinite recursion.

Consider a couple of examples: Multiplying a pair of square matrices and sorting a list.

Using the conventional matrix multiplication algorithm to compute the product of a pair of N×N matrices requires N^3 multiplications and N^3 - N^2 additions; it is an O(N^3) operation. Suppose the matrices are of size 2N×2N. Each matrix can be split into four N×N submatrices. The product of the original matrices can be computed using these submatrices. Since 2×2 matrix multiplication requires 8 multiplications and 4 additions, the partitioned problem involves 8 N×N matrix multiplications and 4 N×N matrix additions, for a total of 8N^3 = (2N)^3 scalar multiplications and 8(N^3 - N^2) + 4N^2 = (2N)^3 - (2N)^2 additions. Partitioning doesn't buy a thing here!
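As a quick numerical check on that claim, here is a minimal sketch (the function names are illustrative, not from the thread) that counts scalar multiplications for the conventional algorithm and for one level of 2×2 block partitioning:

Code:
def naive_mults(n):
    """Conventional n x n matrix multiplication: n^3 scalar multiplications."""
    return n ** 3

def partitioned_mults(n):
    """One level of 2x2 block partitioning of an n x n product (n even):
    8 products of (n/2) x (n/2) blocks, each costing (n/2)^3 multiplications."""
    assert n % 2 == 0, "sketch assumes an even dimension"
    m = n // 2
    return 8 * naive_mults(m)

for n in (4, 8, 16):
    print(n, naive_mults(n), partitioned_mults(n))  # the two counts are identical

Both counts agree for every even n, which is the "partitioning doesn't buy a thing" observation in numbers.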

There exists a non-conventional technique for multiplying 2×2 matrices (Strassen's algorithm) that requires but seven multiplications (at the cost of extra bookkeeping and a lot of extra additions). Partitioning can be of benefit here. Infinite recursion is not possible; a 1×1 matrix cannot be split into smaller parts. For large matrices, this technique can be used to reduce the order from N^3 to N^{\log_2 7}.
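The count behind that N^{\log_2 7} figure follows from the recurrence M(1) = 1, M(n) = 7 M(n/2); a minimal sketch (assuming n is a power of two, counting only multiplications and ignoring the extra additions and bookkeeping):

Code:
import math

def strassen_mults(n):
    """Multiplication count under a Strassen-style recursion:
    M(1) = 1, M(n) = 7 * M(n/2)."""
    if n == 1:
        return 1
    return 7 * strassen_mults(n // 2)

for n in (64, 256, 1024):
    # M(n) = 7^(log2 n) = n^(log2 7), i.e. about n^2.807 rather than n^3
    print(n, strassen_mults(n), 7 ** int(math.log2(n)))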

Partitioning is the key tactic needed to make sorting an O(n log n) problem rather than O(n^2). However, the partitioning can only be carried so far (you can't split a list of one into parts) and the resulting sorted lists need to be merged.
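A minimal merge sort sketch makes both points concrete: the recursion bottoms out at lists of one element, and the merge at the end is the cost of reconstructing the solution:

Code:
def merge_sort(xs):
    """Divide-and-conquer sort: O(n log n) comparisons."""
    if len(xs) <= 1:          # a list of one cannot be split further
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # Reconstruction step: merging two sorted halves costs O(n)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]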
 
Yeah, I definitely understand what you're saying. But take your matrix multiplication example: say it runs in some time T, and then you partition it. The time the computation takes would be less, right (assuming it can be partitioned, you have a large enough matrix, etc.)? That's roughly what my formula above is trying to say (though you can modify the bounds of the summation depending on the situation). Is this right?
 
No. Partitioning the standard matrix multiplication algorithm doesn't buy a thing (the total number of multiplications and additions doesn't change). In fact, it hurts: it takes time to perform the partitioning, extra time to do the extra bookkeeping that comes with partitioning, and extra time to reconstruct the solution from the partitioned products. Divide and conquer does not always work. Sometimes using divide and conquer costs more than just solving the original problem. Matrix multiplication is one case where this occurs. There are many, many others.
 
