Average complexity -- worst case of an algorithm

In summary: there is no single best algorithm; "the" best algorithm depends on how often you run it and on which inputs. If you run it only rarely, a simple, readable algorithm can be better than a more complicated one that makes fewer comparisons.
  • #1
TheMathNoob

Homework Statement


a) Write an algorithm to find the median of three distinct integers a, b, c.
b) Describe D, the set of inputs for your algorithm, in light of the discussion in Section 1.4.3 following Example 1.9. (That section treats the average complexity of an algorithm, which requires an understanding of random variables. Example 1.9 asks for an algorithm that finds a certain value in an array, and the book says that in that case the inputs can be categorized according to where in the array the value appears.)
c) How many comparisons does your algorithm do in the worst case? On average?
d) How many comparisons are necessary in the worst case to find the median of three numbers? Justify your answer.

Homework Equations



Average complexity: A(n) = ∑_{I∈Dn} Pr(I)·t(I)

where t(I) is the number of basic operations the algorithm performs on input I and Pr(I) is the probability that input I occurs.
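For example, in the sequential-search setting of Example 1.9, if the value is equally likely to appear in any of the n array positions (so Pr(I_i) = 1/n) and finding it at position i costs t(I_i) = i comparisons, the formula gives

A(n) = ∑_{i=1}^{n} (1/n)·i = (n+1)/2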

The Attempt at a Solution


a)

// attempt: median of three distinct integers (and/or are the C++ spellings of &&/||)
int median3(int a, int b, int c)
{
    int median;
    if ((a>b and a<c) or (a<b and a>c))
        median = a;
    else if ((b>a and b<c) or (b<a and b>c))
        median = b;
    else
        median = c;
    return median;
}

b) Here I tried to categorize the events as in the previous exercise, with I_i ∈ D_n:

I_1 = the event in which the first comparison is true
I_2 = the event in which the second comparison is true and the previous one is false (by logic)
I_3 = the event in which the third comparison is true and the previous ones are false (by logic)
I_4 = the event in which the fourth comparison is true and the previous ones are false

c) the compiler makes 4 comparisons in the worst case

Average complexity: A(n) = ∑_{i=1}^{4} Pr(I_i)·t(I_i)

Each I_i is equally likely because there are 6 possible orderings in total (a<b<c, b<a<c, etc.) and exactly one ordering makes I_i happen.

so

Pr(I_i) = 1/6 for 1 ≤ i ≤ 4

t(I_i) = i, because if the compiler is reading the i-th comparison, it must also have read the previous comparisons.

A(n) = (1/6)·1 + (1/6)·2 + (1/6)·3 + (1/6)·4 = 10/6 ≈ 1.67

Note: this result will vary with the compiler, because compilers can evaluate the conditions in different orders. Here we assume that once the compiler sees the first comparison is true, it doesn't look at the second one.

Is all that right?

d) I did not get this part because, to me, it looks the same as part c.
 
  • #2
There are 6 possible outcomes, but you have just 4 cases. If every case had probability 1/6, then with probability 2/6 you wouldn't land in any case. I think you are missing something here.
TheMathNoob said:
c) the compiler makes 4 comparisons in the worst case
"a>b and a<c" are two comparisons. The compiler won't make any comparisons, but running the program will. A good compiler will probably arrange the code so that "a<c" is not tested at all if "a>b" fails (because "X and Y" cannot be true if X is false). For "or" it works the other way round, which you already considered.

You can reduce the number of necessary comparisons if you rewrite your code, but I don't know if optimizing it is part of the task.
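
For concreteness, here is a minimal C++ sketch (not part of the assignment) that routes every comparison through a counting helper and enumerates all six orderings, assuming short-circuit evaluation of both "and" and "or":

#include <algorithm>
#include <array>
#include <iostream>

int cmps; // comparison counter

// every comparison goes through this helper so it can be counted
bool lt(int x, int y) { ++cmps; return x < y; }

// same logic as the posted code, with a>b expressed as lt(b, a), etc.
int median3(int a, int b, int c)
{
    if ((lt(b, a) and lt(a, c)) or (lt(a, b) and lt(c, a)))
        return a;
    if ((lt(a, b) and lt(b, c)) or (lt(b, a) and lt(c, b)))
        return b;
    return c;
}

int main()
{
    std::array<int, 3> v{1, 2, 3};
    int total = 0, worst = 0;
    do {                                  // visit all 6 orderings of three distinct values
        cmps = 0;
        median3(v[0], v[1], v[2]);
        total += cmps;
        worst = std::max(worst, cmps);
    } while (std::next_permutation(v.begin(), v.end()));
    std::cout << "average = " << total / 6.0 << ", worst = " << worst << "\n";
}

It prints average = 4.66667 and worst = 6: the six orderings cost 2, 3, 5, 6, 6 and 6 comparisons, so the true average is 28/6 ≈ 4.67, not 10/6.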
 
  • #3
mfb said:
There are 6 possible outcomes, but you have just 4 cases. If every case had probability 1/6, then with probability 2/6 you wouldn't land in any case. I think you are missing something here.
"a>b and a<c" are two comparisons. The compiler won't make any comparisons, but running the program will. A good compiler will probably arrange the code so that "a<c" is not tested at all if "a>b" fails (because "X and Y" cannot be true if X is false). For "or" it works the other way round, which you already considered.

You can reduce the number of necessary comparisons if you rewrite your code, but I don't know if optimizing it is part of the task.
Hi mfb, I did just 4 comparisons instead of six. As you can see, in the last else block there is no point in adding two more unnecessary comparisons. How can you optimize that code?
 
  • #4
TheMathNoob said:
Hi mfb, I did just 4 comparisons instead of six.
Sure, but more cases can happen than the ones you listed. All four (actually eight) comparisons can fail; that is another case (and it is more frequent than the other 4 you considered so far).
TheMathNoob said:
How can you optimize that code?
It is your homework problem. One possible start is "if(a>b) {".
 
  • #5
Hmmm, I do not understand. Once the four comparisons fail, the compiler proceeds immediately to set median = c, which is not a comparison. The average complexity here is only about comparisons.
mfb said:
Sure, but more cases can happen than the ones you listed. All four (actually eight) comparisons can fail; that is another case (and it is more frequent than the other 4 you considered so far).
It is your homework problem. One possible start is "if(a>b) {".
 
  • #6
TheMathNoob said:
Once the four comparisons fail, the compiler proceeds immediately to set median = c, which is not a comparison.
Sure, but the computer had to do all eight comparisons to get to this case.
Well, at least 6 even with short-circuit optimization of the "and" evaluation. But certainly comparisons have been done by the time you set the median to c.
 
  • #7
mfb said:
Sure, but the computer had to do all eight comparisons to get to this case.
Well, at least 6 even with short-circuit optimization of the "and" evaluation. But certainly comparisons have been done by the time you set the median to c.
OK, so what I have to do is reconsider all 8 comparisons and then try to find a way to optimize my code?
 
  • #8
For numbers where c is the median, you need 6-8 comparisons right now.

Your algorithm is not wrong, and if it is not expected to run millions of times I would certainly write it that way, because it is easy to read. You can reduce the number of comparisons if you like; I don't think you have to. It would make the comparison between (c) and (d) nicer, however.
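
For reference, one way to continue from the "if(a>b) {" hint in post #4 (a sketch, not necessarily the intended solution) uses at most 3 comparisons:

int median3(int a, int b, int c)
{
    if (a > b) {
        if (b > c) return b;       // c < b < a: b is the median
        return (a > c) ? c : a;    // b is the smallest: median is the smaller of a and c
    }
    if (a > c) return a;           // c < a < b: a is the median
    return (b > c) ? c : b;        // a is the smallest: median is the smaller of b and c
}

Over the six equally likely orderings this makes (2+2+3+3+3+3)/6 = 16/6 ≈ 2.67 comparisons on average, and its worst case of 3 matches part (d): whichever two comparisons an algorithm makes first, an adversary can answer them so that the median is still undetermined, so 3 comparisons are necessary in the worst case.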
 
  • #9
mfb said:
For numbers where c is the median, you need 6-8 comparisons right now.

Your algorithm is not wrong, and if it is not expected to run millions of times I would certainly write it that way, because it is easy to read. You can reduce the number of comparisons if you like; I don't think you have to. It would make the comparison between (c) and (d) nicer, however.
Thank you so much for your help!
 

What is the concept of average complexity in algorithm analysis?

The average complexity of an algorithm refers to the expected time or space required for the algorithm to run on a set of inputs, taking into account all possible input combinations and their probabilities. It is a measure of the algorithm's efficiency on average.

What is the worst-case complexity of an algorithm?

The worst-case complexity of an algorithm is the maximum time or space required for the algorithm to run on any input of a given size. It represents the algorithm's performance in the most unfavorable scenario.

How does the average complexity differ from the worst-case complexity?

The average complexity weighs all possible inputs by their probabilities, while the worst-case complexity considers only the most expensive input of a given size. The average is therefore often a more realistic measure of typical behavior, whereas the worst case provides a guarantee.

Why is it important to analyze the average complexity of an algorithm?

Analyzing the average complexity allows us to predict the performance of an algorithm on a variety of inputs. It also helps in comparing different algorithms and choosing the most efficient one for a given problem.

What are some common techniques for calculating the average complexity of an algorithm?

Some common techniques for calculating the average complexity include the use of probability distributions, recurrence relations, and the concept of amortized analysis. Additionally, empirical analysis or testing can also be used to estimate the average complexity of an algorithm.
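
As an illustration of the empirical approach, here is a minimal self-contained sketch (in the same C++ style as above; the routine and counter names are just for this example) that estimates an algorithm's average comparison count by running it on random inputs:

#include <iostream>
#include <random>

int cmps; // global comparison counter

bool lt(int x, int y) { ++cmps; return x < y; }

// a 3-comparison median as the routine under test
int median3(int a, int b, int c)
{
    if (lt(b, a))                                  // a > b
        return lt(c, b) ? b : (lt(c, a) ? c : a);
    return lt(c, a) ? a : (lt(c, b) ? c : b);
}

int main()
{
    std::mt19937 gen(42);                          // fixed seed for reproducibility
    std::uniform_int_distribution<int> dist(1, 1000000);
    const int trials = 100000;
    long long total = 0;
    for (int i = 0; i < trials; ++i) {
        cmps = 0;
        median3(dist(gen), dist(gen), dist(gen));
        total += cmps;
    }
    std::cout << "estimated average: " << double(total) / trials << "\n";
}

The estimate settles near 16/6 ≈ 2.67, the analytical average for this routine; ties among the random values are rare enough here not to affect the estimate noticeably.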
