It depends on a few things:
- runtime (different browsers / runtimes use different sorting algorithms)
- how your input is ordered relative to the desired order
- whether or not you use a custom comparator (also related to the previous point)

An application I am working on experienced a serious performance degradation in a module that sorted a list of 35K+ strings, after the API endpoint it calls started returning the data already sorted. The time spent sorting on the front end went from roughly 30 ms to 6 seconds (a 200x slowdown).
Sorting is done with a custom comparator that prioritizes strings ending with a specific suffix. If neither or both strings end with the suffix, natural ordering is used. I profiled the module with the browser developer tools and found that most of the time was spent in this comparison. The profile also showed that QuickSort is the main algorithm behind Array.sort() (at least in Chrome).
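
For illustration, a comparator along these lines might look like the sketch below. It is an assumption based on the description above (using the `/Prod` suffix mentioned further down), not the module's actual code:

```js
// Sketch of a comparator that puts strings ending in a given suffix first
// and falls back to natural ordering otherwise. This is an assumption based
// on the description in the text, not the real module's code.
const SUFFIX = '/Prod';

function compareWithSuffixPriority(a, b) {
  const aHasSuffix = a.endsWith(SUFFIX);
  const bHasSuffix = b.endsWith(SUFFIX);
  if (aHasSuffix !== bHasSuffix) {
    // Exactly one of the two strings ends with the suffix: it wins.
    return aHasSuffix ? -1 : 1;
  }
  // Neither or both end with the suffix: natural ordering.
  return a < b ? -1 : a > b ? 1 : 0;
}

console.log(['ABC/Beta', 'ABC/Prod', 'ABC/Alpha'].sort(compareWithSuffixPriority));
// -> [ 'ABC/Prod', 'ABC/Alpha', 'ABC/Beta' ]
```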

This was strange, since QuickSort is not supposed to be affected by input ordering. According to Wikipedia, though:
> [The worst case] may occur if the pivot happens to be the smallest or largest element in the list, or in some implementations (for example, the Lomuto partition scheme as described above) when all the elements are equal.
I got curious and benchmarked a few variations of this scenario, using benchmark.js running on node from the command line. Both node and the browser run on top of V8, so they should use the same sorting algorithm. The results were unexpected:

    6 tests completed.

    Ordered array, sorted with a default comparator            x 34.27 ops/sec ±1.07% (59 runs sampled)
    Ordered array, sorted with a custom comparator             x  0.18 ops/sec ±2.81% (5 runs sampled)
    Ordered array, shuffled, sorted with a custom comparator   x 38.37 ops/sec ±3.67% (51 runs sampled)
    Ordered array, shuffled, sorted with a default comparator  x 29.20 ops/sec ±1.28% (51 runs sampled)
    Unordered array, sorted with a default comparator          x 28.38 ops/sec ±1.28% (50 runs sampled)
    Unordered array, sorted with a custom comparator           x 42.10 ops/sec ±1.32% (55 runs sampled)
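
For reference, the benchmark was set up roughly like this. This is a sketch, not the exact script that produced the numbers above; the comparator, the data generation, and the shuffle are illustrative, made up to resemble the input characteristics described below:

```js
// Sketch of the benchmark: benchmark.js on node, sorting copies of a ~35K-string
// array that is either pre-sorted or shuffled, with and without the custom
// comparator. The data below is invented to resemble the described input.
const Benchmark = require('benchmark');

const SUFFIX = '/Prod';
const customComparator = (a, b) => {
  const aHas = a.endsWith(SUFFIX);
  const bHas = b.endsWith(SUFFIX);
  if (aHas !== bHas) return aHas ? -1 : 1;
  return a < b ? -1 : a > b ? 1 : 0;
};

// Roughly 20% of the strings end with the suffix, in groups like
// item0001/Alpha, item0001/Beta, ..., item0001/Prod.
const ordered = [];
for (let i = 0; i < 7000; i++) {
  const group = `item${String(i).padStart(4, '0')}`;
  for (const variant of ['/Alpha', '/Beta', '/Delta', '/Gamma', SUFFIX]) {
    ordered.push(group + variant);
  }
}
ordered.sort(); // the "ordered" case: the data already arrives sorted

// Fisher-Yates shuffle on a copy for the "shuffled" case.
const shuffled = ordered.slice();
for (let i = shuffled.length - 1; i > 0; i--) {
  const j = Math.floor(Math.random() * (i + 1));
  [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
}

new Benchmark.Suite()
  .add('Ordered array, sorted with a default comparator', () => ordered.slice().sort())
  .add('Ordered array, sorted with a custom comparator', () => ordered.slice().sort(customComparator))
  .add('Ordered array, shuffled, sorted with a default comparator', () => shuffled.slice().sort())
  .add('Ordered array, shuffled, sorted with a custom comparator', () => shuffled.slice().sort(customComparator))
  .on('cycle', event => console.log(String(event.target)))
  .run();
```
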
These results show that the performance degradation is tied to the interaction between the data ordering and the comparator. Some characteristics of the input:
- the suffix the comparator prioritizes (`/Prod`) matches roughly 20% of the strings
- when the strings are sorted alphabetically, the ones ending in `/Prod` tend to be spread fairly evenly throughout the array
- sequences such as `ABC/Alpha`, `ABC/Beta`, `ABC/Prod` are common
This probably makes the algorithm much more likely to pick a pivot that sits at an extreme of its subsequence and, as a result, to perform a very large number of element comparisons.
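
One way to check this is to instrument the comparator and count how many times Array.prototype.sort invokes it for ordered versus shuffled input. Below is a small sketch of that technique; the helper and the tiny stand-in array are illustrative:

```js
// Sketch: wrap a comparator to count its invocations. Running this against the
// ordered and shuffled ~35K-string arrays described above (on an affected V8
// version) is one way to observe the blow-up in comparison count.
function sortWithCount(arr, compare) {
  let calls = 0;
  const sorted = arr.slice().sort((a, b) => {
    calls += 1;
    return compare(a, b);
  });
  return { sorted, calls };
}

// Tiny stand-in input and comparator, just to show the usage.
const naturalCompare = (a, b) => (a < b ? -1 : a > b ? 1 : 0);
const { calls } = sortWithCount(['ABC/Beta', 'ABC/Prod', 'ABC/Alpha'], naturalCompare);
console.log(`comparator was called ${calls} times`);
```
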
This only happens in Chrome 61. I also tested Firefox 52.3 and Safari 10.1, and the problem does not reproduce in either; I assume this is because they use different sorting algorithms.