~Death~
Like, for any algorithm, how do you actually determine its time complexity?
On Wikipedia it lists multiplication (the schoolbook method) as O(n^2),
but some algorithms have fractional exponents, something like O((5/2)n^2.2342), so how would you determine that?
Do you just run it on a computer, make a graph of input size in bytes vs. time in seconds, and fit a curve to it?
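That curve-fitting idea can at least be sketched. One caveat: constant factors, lower-order terms, and caching effects mean a fitted curve only *suggests* the asymptotic order, it doesn't prove it. A minimal sketch (the helper names `measure` and `fit_power_law` are my own): time the operation at several sizes, then fit a power law t ≈ c·n^k by linear regression in log-log space, so the fitted slope k estimates the exponent.

```python
import math
import timeit

def measure(func, sizes, repeats=5):
    """Time func(n) at each input size n; return (n, avg seconds) pairs."""
    return [(n, timeit.timeit(lambda: func(n), number=repeats) / repeats)
            for n in sizes]

def fit_power_law(points):
    """Least-squares fit of log t = log c + k*log n; returns (c, k).
    If runtime behaves like c * n^k, the slope k estimates the exponent."""
    xs = [math.log(n) for n, _ in points]
    ys = [math.log(t) for _, t in points]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    k = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    c = math.exp(my - k * mx)
    return c, k
```

For timings whose growth really is ~c·n^2, the fitted k comes out close to 2; on real measurements you'd want many sizes and repeats to smooth out noise.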
Like, I understand it for e.g. the problem of determining whether there exists a subset of a set of integers that adds up to 0:
if the algorithm is to check each subset, and there are n integers, then there are 2^n - 1 non-empty subsets to check, but then you'd have to add up each one, so would that count as a step too?
So it's something like 2^n + n - 1 steps for an input of n. Does that mean it's on the order of O(2^n)?
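The brute-force check described above, enumerating all 2^n - 1 non-empty subsets and summing each, could be sketched like this (the function name `has_zero_subset` is my own; note that summing a subset of size k itself costs k - 1 additions, so the total addition count grows like n·2^n, still exponential):

```python
from itertools import chain, combinations

def has_zero_subset(nums):
    """Brute-force subset-sum: check every non-empty subset of nums
    for one whose elements add to 0. There are 2^n - 1 such subsets."""
    n = len(nums)
    subsets = chain.from_iterable(
        combinations(nums, k) for k in range(1, n + 1))
    return any(sum(s) == 0 for s in subsets)

# e.g. has_zero_subset([3, -1, 8, -2]) is True: 3 + (-1) + (-2) == 0
```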