Asymptotic time complexity of a recursive function

Summary
The discussion focuses on analyzing the asymptotic time complexity of a recursive function defined by specific conditions. The user has implemented the function and confirmed its correctness but struggles with the complexity analysis. They initially argue that the function runs in O(n) time because larger values of ARG (relative to the constants N1 and C1) take longer to complete. Feedback points out that the key quantity is the number of times the function is called, which calls for a recurrence relation to determine the complexity. Ultimately, O(n) happens to be the right answer, but justifying it requires setting up and solving that recurrence rather than appealing to the size of ARG.
GeorgeCostanz

Homework Statement



I've been asked to develop a recursive function and then analyze the asymptotic time complexity.

Code:
f(N) = 0, if N < N1

f(N1) = C1

f(N)= A1 + M1*f(M2*N/D1 - S1) Op M3*f(M4*N/D2 - S2), if N > N1


Homework Equations



We're to assume that:

S1 = S2 = 0

M2 = M4 = 1

D1 = D2 > 1
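Under these assumptions both recursive calls receive the same argument, so the general recurrence collapses to a simpler form (a sketch, writing D for the common value D1 = D2):

```latex
f(N) \;=\; A_1 \;+\; M_1\, f\!\left(\frac{N}{D}\right)\ \mathrm{Op}\ M_3\, f\!\left(\frac{N}{D}\right), \qquad N > N_1 .
```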

The Attempt at a Solution



the user enters N1 C1 A1 M1 M2 M3 M4 D1 D2 S1 S2 and then ARG

Code:
/* n1, c1, a1, m1, m2, m3, m4, d1, d2, s1, s2 are globals
   filled in from user input before this function is called */
int Recursion_Plus(int ARG)
{
    if (ARG < n1)
    {
        return 0;
    }
    else if (ARG == n1)
    {
        return c1;
    }
    else  /* ARG > n1 */
    {
        return a1 + m1 * Recursion_Plus(m2*ARG/d1 - s1)
                  + m3 * Recursion_Plus(m4*ARG/d2 - s2);
    }
}

I've tested my recursive function against the instructor's program and it produces exactly the same results, so I've moved on to my analysis, where I've hit a wall.

I'm struggling to wrap my brain around this so bear with me please.

My attempt at a partial solution:

a1 & m1 & m3 are insignificant because they're outside the recursive call

a1 + m1*___ = 1 unit of time

m1*___ = 1 unit of time

adding the 2 recursive calls together is 1 unit of time

m3*___ = 1 unit of time

This is where I get lost. From the instructions we're given, both recursive calls will be made with the same argument every time, and every successive argument will be smaller than the last because d1 = d2 > 1.

So the larger ARG is (in comparison to n1 & c1), the longer it takes to complete and the larger the result will be. So the algorithm takes O(n) time.

I'd appreciate it if anyone could let me know if I'm on the right track or completely lost. Thanks
 
GeorgeCostanz said:
So the larger ARG is (in comparison to n1 & c1), the longer it takes to complete and the larger the result will be. So the algorithm takes O(n) time.

Why not O(n^2), or O(2^n), or O(log n)?

I'm afraid you are completely lost, although it happens that you have the right answer. Forget about the time it takes to perform calculations*, the important thing is the number of times the function is called.

If you have a mathematical background, let the number of times the function is called when ARG=n be K(n). Find a recurrence relation for K(n).
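For example (a sketch, under the homework simplifications with common divisor d = D1 = D2 > 1, counting one unit per call):

```latex
K(n) =
\begin{cases}
1, & n \le N_1,\\[2pt]
2\,K(n/d) + 1, & n > N_1.
\end{cases}
```

Unrolling over the L ≈ log_d(n/N1) levels of recursion gives K(n) = 2^(L+1) − 1 = Θ(n^(log_d 2)), which is Θ(n) exactly when d = 2.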

If your maths is less formal, here is a clue: when ARG ≤ N1, time ≈ k. When ARG ≤ N1 × D1, time ≈ k + k + k. When ARG ≤ N1 × D1^2, time ≈ k + 3k + 3k. When ARG ≤ N1 × D1^3, time ≈ k + 7k + 7k...

* You can do this because ARG is an int, so calculations are performed in constant time. If the problem involved e.g. arbitrary-precision arithmetic this would not be the case.
 
