Asymptotic time complexity of a recursive function

In summary: the task is to develop a recursive function and analyze its asymptotic time complexity. The function f(N) returns 0 if N is less than N1, and returns C1 if N equals N1. For N greater than N1, it performs a calculation involving the constants A1, M1, M2, M3, M4, D1, D2, S1, and S2 plus two recursive calls. The time complexity works out to O(n), determined by the number of times the function is called, which can be expressed as a recurrence relation K(n).
  • #1
GeorgeCostanz

Homework Statement

I've been asked to develop a recursive function and then analyze its asymptotic time complexity.

Code:
f(N) = 0, if N < N1

f(N1) = C1

f(N)= A1 + M1*f(M2*N/D1 - S1) Op M3*f(M4*N/D2 - S2), if N > N1


Homework Equations

We're to assume that:

S1 = S2 = 0

M2 = M4 = 1

D1 = D2 > 1
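
Under these assumptions (and writing Op as + for concreteness) both recursive calls receive the same argument, so the recurrence in the problem statement collapses to:

```latex
f(N) \;=\; A_1 \;+\; M_1\, f\!\left(\frac{N}{D_1}\right) \;+\; M_3\, f\!\left(\frac{N}{D_1}\right), \qquad N > N_1 .
```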

The Attempt at a Solution

The user enters N1, C1, A1, M1, M2, M3, M4, D1, D2, S1, S2, and then ARG.

Code:
int Recursion_Plus(int ARG)
{
    /* n1, c1, a1, m1, m2, m3, m4, d1, d2, s1, s2 are globals
       read from the user before the first call */
    if (ARG < n1)
    {
        return 0;
    }
    else if (ARG == n1)
    {
        return c1;
    }
    else /* ARG > n1 */
    {
        return a1 + m1 * Recursion_Plus(m2*ARG/d1 - s1)
                  + m3 * Recursion_Plus(m4*ARG/d2 - s2);
    }
}

I've tested my recursive function against the instructor's program and it behaves identically, so I've moved on to my analysis, where I've hit a wall.

I'm struggling to wrap my brain around this so bear with me please.

My attempt at a partial solution:

a1, m1, and m3 are insignificant because they're outside the recursive calls

a1 + m1*___ = 1 unit of time

m1*___ = 1 unit of time

adding the 2 recursive calls together is 1 unit of time

m3*___ = 1 unit of time

This is where I get lost. From the instructions we're given, both recursive calls will be made with the same argument every time, and each successive argument will be smaller than the last because d1 = d2 > 1.

So the larger ARG is (compared to n1 and c1), the longer the function takes to complete and the larger the result will be. So the algorithm takes O(n) time.

I'd appreciate it if anyone could let me know if I'm on the right track or completely lost. Thanks
 
  • #2
GeorgeCostanz said:
So the larger ARG is (in comparison to n1 & c1), the longer it takes to complete and the larger the result will be. So the algorithm takes O(n) time.

Why not O(n²), or O(2ⁿ), or O(log n)?

I'm afraid you are completely lost, although it happens that you have the right answer. Forget about the time it takes to perform calculations*, the important thing is the number of times the function is called.

If you have a mathematical background, let the number of times the function is called when ARG=n be K(n). Find a recurrence relation for K(n).

If your maths is less formal, here is a clue: when ARG ≤ N1, time ≈ k. When ARG ≤ N1 × D1, time ≈ k + k + k. When ARG ≤ N1 × D1², time ≈ k + 3k + 3k. When ARG ≤ N1 × D1³, time ≈ k + 7k + 7k...

* You can do this because ARG is an int, so calculations are performed in constant time. If the problem involved e.g. arbitrary-precision arithmetic this would not be the case.
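
That doubling pattern in the call counts can also be checked empirically by instrumenting the function with a counter. A minimal sketch, assuming hypothetical simplified constants chosen only for illustration (N1 = 1, C1 = A1 = M1 = M3 = 1, M2 = M4 = 1, D1 = D2 = 2, S1 = S2 = 0):

```c
#include <assert.h>

static int calls; /* K(n): how many times the function has been called */

/* f(N) with n1 = 1, c1 = 1, a1 = 1, m1 = m3 = 1, d1 = d2 = 2,
   s1 = s2 = 0 -- hypothetical values, just for illustration */
static int recursion_plus(int arg)
{
    calls++;
    if (arg < 1)
        return 0;
    if (arg == 1)
        return 1; /* base case: C1 */
    /* A1 + M1*f(ARG/D1) + M3*f(ARG/D2): two calls with the same argument */
    return 1 + recursion_plus(arg / 2) + recursion_plus(arg / 2);
}

/* Reset the counter, run f(n), and report K(n). */
static int count_calls(int n)
{
    calls = 0;
    recursion_plus(n);
    return calls;
}
```

With these constants the counts come out as K(1) = 1, K(2) = 3, K(4) = 7, K(8) = 15: each doubling of the argument roughly doubles the number of calls, matching the k, 3k, 7k progression above.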
 

1. What is asymptotic time complexity?

Asymptotic time complexity refers to the rate at which the runtime of a function or algorithm increases as the input size grows towards infinity. It is often denoted using the "Big O" notation.

2. How is asymptotic time complexity of a recursive function calculated?

The asymptotic time complexity of a recursive function is calculated by analyzing the number of recursive calls made and the amount of work done within each call. This can be expressed as a function of the input size, and then simplified using Big O notation.

3. What is the difference between time complexity and asymptotic time complexity?

Time complexity refers to the exact number of operations required as a function of the input size, while asymptotic time complexity describes how that count behaves as the input size grows towards infinity. Time complexity is the more precise measure, while asymptotic time complexity provides a general understanding of the efficiency of an algorithm.

4. Why is it important to consider the asymptotic time complexity of recursive functions?

Recursive functions can easily lead to exponential time complexity, meaning the runtime grows very quickly as the input size increases. Understanding the asymptotic time complexity allows us to predict the efficiency of a recursive algorithm and potentially optimize it to improve performance.
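
A standard illustration of this (a textbook example, not taken from the thread above) is the naive recursive Fibonacci: each call above the base cases spawns two more, so the number of calls grows exponentially in n.

```c
#include <assert.h>

static long fib_calls; /* number of invocations of fib_naive */

/* Naive recursion: the call count satisfies C(n) = C(n-1) + C(n-2) + 1,
   which grows exponentially in n (roughly like the golden ratio to
   the power n). */
static long fib_naive(int n)
{
    fib_calls++;
    if (n < 2)
        return n;
    return fib_naive(n - 1) + fib_naive(n - 2);
}
```

For example, computing fib_naive(10) = 55 already takes 177 calls, and the count roughly multiplies by the golden ratio for each increment of n, which is why memoized or iterative versions are preferred in practice.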

5. Can two recursive functions have the same asymptotic time complexity?

Yes, two recursive functions can have the same asymptotic time complexity. This means that they have the same overall behavior in terms of runtime as the input size increases towards infinity, but they may differ in the exact number of operations required for a specific input size.
