Asymptotic time complexity of a recursive function

In summary, the task is to implement a recursive function and analyze its asymptotic time complexity. The function f(N) returns 0 if N is less than N1 and C1 if N equals N1. For N greater than N1, it performs a calculation involving A1, M1, M2, M3, M4, D1, D2, S1, and S2 plus two recursive calls. The time complexity is O(n), determined by the number of times the function is called, which can be expressed as a recurrence relation K(n).
  • #1

Homework Statement



I've been asked to develop a recursive function and then analyze the asymptotic time complexity.

Code:
f(N) = 0, if N < N1

f(N1) = C1

f(N) = A1 + M1*f(M2*N/D1 - S1) Op M3*f(M4*N/D2 - S2), if N > N1


Homework Equations



We're to assume that:

S1 = S2 = 0

M2 = M4 = 1

D1 = D2 > 1
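
Under these assumptions both recursive arguments collapse to N/D1, so the recurrence simplifies to (a sketch, taking the unspecified operator Op to be +, as in the code below):

$$f(N) = A_1 + M_1\, f\!\left(\tfrac{N}{D_1}\right) + M_3\, f\!\left(\tfrac{N}{D_1}\right), \qquad N > N_1$$

Both recursive calls receive the same, strictly smaller argument.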

The Attempt at a Solution



The user enters N1, C1, A1, M1, M2, M3, M4, D1, D2, S1, S2, and then ARG.

Code:
int Recursion_Plus(int ARG)
{
    if (ARG < n1)
        return 0;
    else if (ARG == n1)
        return c1;
    else  /* ARG > n1: two recursive calls on a smaller argument */
        return a1 + m1 * Recursion_Plus(m2 * ARG / d1 - s1)
                  + m3 * Recursion_Plus(m4 * ARG / d2 - s2);
}

I've tested my recursive function against the instructor's program and it works exactly the same so I've moved on to my analysis, where I've hit a wall.

I'm struggling to wrap my brain around this so bear with me please.

My attempt at a partial solution:

a1, m1, and m3 are insignificant because they sit outside the recursive calls; each contributes only constant work:

a1 + m1*___ = 1 unit of time

m1*___ = 1 unit of time

adding the 2 recursive calls together is 1 unit of time

m3*___ = 1 unit of time

This is where I get lost. From the instructions we're given, both recursive calls are made with the same argument every time, and each successive argument is smaller than the last because d1 = d2 > 1.

So the larger ARG is (in comparison to n1 & c1), the longer it takes to complete and the larger the result will be. So the algorithm takes O(n) time.

I'd appreciate it if anyone could let me know if I'm on the right track or completely lost. Thanks
 
  • #2
GeorgeCostanz said:
So the larger ARG is (in comparison to n1 & c1), the longer it takes to complete and the larger the result will be. So the algorithm takes O(n) time.

Why not O(n²), or O(2ⁿ), or O(log n)?

I'm afraid you are completely lost, although it happens that you have the right answer. Forget about the time it takes to perform the calculations*; the important thing is the number of times the function is called.

If you have a mathematical background, let the number of times the function is called when ARG=n be K(n). Find a recurrence relation for K(n).

If your maths is less formal, here is a clue: when ARG ≤ N1, time ≈ k. When ARG ≤ N1 × D1, time ≈ k + k + k. When ARG ≤ N1 × D1², time ≈ k + 3k + 3k. When ARG ≤ N1 × D1³, time ≈ k + 7k + 7k...
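
Closing that pattern into a formula (a sketch, writing m for the recursion depth, so that n ≈ N1 · D1^m):

$$K(n) \approx k\left(1 + 2 + 4 + \cdots + 2^{m}\right) = \left(2^{m+1} - 1\right)k$$

With ##m \approx \log_{D_1}(n/N_1)## this gives ##K(n) = O\!\left(n^{\log_{D_1} 2}\right)##, which is at most O(n) for any integer ##D_1 \ge 2##.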


* you can do this because ARG is an int, so each calculation is performed in constant time. If the problem involved e.g. arbitrary precision arithmetic this would not be the case.
 
