Comment by bermanoid
15 years ago
If we're talking about actual sorting algorithms that aren't pathologically inefficient (i.e. if we're not allowing algorithms that deliberately make negative progress or do random stuff instead of trying to find the solution), I'd think the "pessimal" deterministic solution (and TBH I'm not 100% clear on the definition of "pessimal" — worst-case runtime, worst best-case runtime, average, etc.) would have to be essentially to enumerate the permutation group of N elements, checking each time whether the permutation creates a sorted list. There are factorial(N) members in that group, which is a pretty bad worst-case running time for a sort.
There are many ways to enumerate permutations, and we could probably pick a terrible one there, too — especially with a really naive implementation, without any memoization or anything like that, we could waste all sorts of time.
Bonus points for generating the whole list of permutations fresh for each isSorted(applyPermutation(i,list)) test.
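To make that concrete, here's a sketch of what such a permutation-sort might look like in Python, including the bonus-points badness of rebuilding the entire factorial(N)-sized permutation list on every single probe (the names isSorted/applyPermutation are just my guesses at the pseudo-calls above, rendered in Python style):

```python
from itertools import permutations
from math import factorial

def is_sorted(seq):
    # A list is sorted iff every adjacent pair is in order.
    return all(a <= b for a, b in zip(seq, seq[1:]))

def apply_permutation(i, lst):
    # Bonus badness: regenerate the entire factorial(N)-sized list
    # of permutations from scratch just to pull out the i-th one.
    return list(permutations(lst))[i]

def permutation_sort(lst):
    # Probe all factorial(N) orderings until one happens to be sorted.
    for i in range(factorial(len(lst))):
        candidate = apply_permutation(i, lst)
        if is_sorted(candidate):
            return list(candidate)
```

That's factorial(N) work per probe times up to factorial(N) probes — roughly (N!)^2 comparisons in the worst case, all while looking like an honest (if deeply confused) attempt at sorting.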
I think to do much worse you'd probably have to start cheating — throwing in garbage calculations that merely waste time rather than steps anyone would actually think of as progress towards the solution. (I could certainly dream up a nasty brute-force problem to solve for the index integers used in the inner loop, but that's in the category of deliberate time-wasting.) The nice thing about this solution is that it's 100% conceivable that someone who wasn't thinking straight could come up with it and implement it just like this. Hell, a poorly written Prolog program would probably come close to implementing this algorithm if the sorting problem was specified in the right (wrong?) way...
There's also always SlowSort: http://c2.com/cgi/wiki?SlowSort, which works by finding the maximum element, then recursively sorting the rest of the list. That wouldn't be so terrible if it weren't for the twist: SlowSort finds each maximum element by splitting the list in half, sorting both halves (using SlowSort, of course) to read off their maxima, and then picking the bigger of the two. I think this still runs faster than N!, but its runtime is guaranteed, whereas when generating permutations you might accidentally hit the right one in the first step (you could always work around that by calculating all the lists first and then checking them, but again, that seems like cheating).
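For reference, here's a sketch of SlowSort as I understand the description above — sort both halves recursively, swap the larger of the two maxima to the end, then recursively sort everything but that maximum (index conventions are my own):

```python
def slowsort(a, i, j):
    # Sort a[i..j] in place, as slowly as possible in good conscience.
    if i >= j:
        return
    m = (i + j) // 2
    slowsort(a, i, m)        # recursively sort the first half...
    slowsort(a, m + 1, j)    # ...and the second half
    # The maxima of the two halves now sit at a[m] and a[j];
    # move the larger one to the end of the range.
    if a[m] > a[j]:
        a[m], a[j] = a[j], a[m]
    slowsort(a, i, j - 1)    # recursively sort all but the maximum
```

Called as slowsort(xs, 0, len(xs) - 1), every step genuinely makes progress — it just insists on re-sorting subranges it has already sorted, which is what blows the runtime up to roughly N^(log N) rather than anything polynomial.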