# How is computational complexity determined?

1. Jul 13, 2009

### ~Death~

like for any algorithm?

On Wikipedia it lists multiplication as O(n^2),

but some of them have decimals in the exponent, like O(5/2 n^2.2342) or something, so how would you determine that?

Just use a computer, make a graph of input size in bytes vs. running time in seconds, and fit a curve to it?

Like I understand it for, e.g., the problem of determining whether there exists a subset of a set of integers that adds up to 0:

If the algorithm was to check each subset: if there are n integers then there are 2^n - 1 non-empty subsets to check, but then you'd have to add up each one, so would that count as a step?

so there are 2^n + n - 1 steps for an input of n ... does that mean it's on the order of O(2^n)?
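(The brute-force idea above can be sketched in Python; the function name and step-counting are just illustrative assumptions, not code from any particular source. It enumerates every non-empty subset and counts how many it checks, which grows as 2^n - 1.)

```python
from itertools import chain, combinations

def subset_sum_zero(nums):
    """Brute-force subset sum: test every non-empty subset for a zero sum.

    With n integers there are 2**n - 1 non-empty subsets, so the loop body
    runs O(2**n) times; summing a subset of size k costs O(k) additions on
    top of that, but the 2**n term dominates.

    Returns (found, subsets_checked).
    """
    n = len(nums)
    checked = 0
    # Generate all subsets of size 1, then size 2, ..., up to size n.
    all_subsets = chain.from_iterable(
        combinations(nums, k) for k in range(1, n + 1)
    )
    for subset in all_subsets:
        checked += 1
        if sum(subset) == 0:
            return True, checked
    return False, checked

found, checked = subset_sum_zero([3, -1, 4, -3])
print(found, checked)  # True after 7 subsets: (3, -3) sums to 0
```

Counting `checked` for inputs with no zero-sum subset shows the 2^n - 1 growth directly (e.g. 7 subsets for n = 3, 15 for n = 4).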

Last edited: Jul 13, 2009
2. Jul 13, 2009

### Dragonfall

You might want to check out http://en.wikipedia.org/wiki/Big-o_notation first.

So yes, $2^n+n-1\in\mathcal{O}(2^n)$. To determine this, you don't run the algorithm; you analyze (usually) the number of bit operations it takes for your algorithm to terminate, given an input of n bits. This count generally varies with the particular input, so we typically look at the WORST case or the AVERAGE case.
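(To spell out why the lower-order terms drop out, here is the standard constant-multiple argument from the definition of big-O; this is a routine calculation, not something from the thread itself.)

$$n - 1 \le 2^n \quad \text{for all } n \ge 1,$$

so

$$2^n + n - 1 \;\le\; 2^n + 2^n \;=\; 2\cdot 2^n \quad \text{for all } n \ge 1.$$

Taking $c = 2$ and $n_0 = 1$ in the definition of big-O gives $2^n+n-1\in\mathcal{O}(2^n)$. The same style of bound is why constant factors and additive lower-order terms never change the big-O class.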