pbuk
Science Advisor, Homework Helper, Gold Member
Baluncore said:
You only need to scan the list once if you test for a sign change of the first difference.
If you use sort you must do it twice, once for ascending and once for descending.

Yes, and even if the algorithm used is ## O(n \log n) ##, in practice performance is much worse than ## k n \log n ## would suggest, because there is a step change in ## k ## when the list no longer fits in the processor cache, and again, dramatically, when the list no longer fits in main memory. There is also the memory usage to consider.
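For reference, a minimal sketch of the single-scan sign-change test Baluncore describes (the function name and the treatment of lists shorter than two elements are my assumptions):

```python
def is_monotonic(lst):
    # Track whether the first differences could still all be >= 0 or all <= 0;
    # a sign change in the differences clears both flags and we can stop early.
    increasing = decreasing = True
    for i in range(1, len(lst)):
        if lst[i] > lst[i - 1]:
            decreasing = False
        elif lst[i] < lst[i - 1]:
            increasing = False
        if not (increasing or decreasing):
            return False  # the first difference has changed sign: not monotonic
    return True  # includes empty and single-element lists

print(is_monotonic([1, 2, 2, 5]))  # True
print(is_monotonic([1, 3, 2]))     # False
```

One pass, early exit on the first sign change, and no extra memory beyond two flags.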
Even if "minimum source code" were a criterion (and I don't see that anywhere in the question) I don't think it would ever be the only criterion without any regard to performance.Baluncore said:But with Python the aim is to write the minimum source code, in a way that can be quickly understood.
Baluncore said:
Sorting twice will win under that measure.

But it will lose against an equally or even more concise implementation of a list traversal, such as the one in #9.
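Post #9 is not quoted here, but a traversal can be written just about as compactly as the double sort; a sketch (the function names are mine, and `itertools.pairwise` needs Python 3.10+):

```python
from itertools import pairwise  # Python 3.10+

def monotonic_by_sorting(lst):
    # Two sorts: O(n log n) time and O(n) extra memory for the sorted copies.
    return lst == sorted(lst) or lst == sorted(lst, reverse=True)

def monotonic_by_traversal(lst):
    # Up to two short-circuiting scans: O(n) time, O(1) extra memory.
    return (all(a <= b for a, b in pairwise(lst)) or
            all(a >= b for a, b in pairwise(lst)))
```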
Baluncore said:
A monotonicity test is not the type of algorithm I would expect to find inside a tight loop, or running 50,000 times each day. If you only run the test once each day, on a 100 element list, then efficiency is irrelevant.

No list length was specified in the OP.

Edit: I agree that premature optimization is to be avoided, but selecting an (at best) ## O(n \log n) ## time, ## O(n) ## memory implementation when there is an at least equally trivial ## O(n) ## time, ## O(1) ## memory one available is premature pessimization, and that is never good.
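To put rough numbers on the pessimization point, a throwaway benchmark along these lines would do (CPython assumed; timings vary by machine, and the input shapes below are my choices):

```python
import random
import timeit
from itertools import pairwise  # Python 3.10+

def by_sorting(lst):
    return lst == sorted(lst) or lst == sorted(lst, reverse=True)

def by_traversal(lst):
    return (all(a <= b for a, b in pairwise(lst)) or
            all(a >= b for a, b in pairwise(lst)))

random_data = [random.random() for _ in range(1_000_000)]  # almost surely not monotonic
sorted_data = sorted(random_data)                          # worst case for the traversal

for name, data in [("random", random_data), ("sorted", sorted_data)]:
    t_sort = timeit.timeit(lambda: by_sorting(data), number=10)
    t_scan = timeit.timeit(lambda: by_traversal(data), number=10)
    print(f"{name}: sorting {t_sort:.3f}s, traversal {t_scan:.3f}s")
```

On random input the traversal returns after a handful of elements; on sorted input it must scan everything, but it still avoids the two sorted copies whose cost jumps once the data outgrows the caches.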