Haskell sortBy complexity

I came up with an idea while solving another SO question, and I'd like some help determining the complexity of this function, since I'm not knowledgeable in this area. Would I be right in guessing that sorting each of the chunks is O(n * log 2)? And then what would be the complexity of the sortBy pass, which compares the last element of one chunk with the head of another? My guess is that this pass compares neighbouring pairs and doesn't have to search for the right chunk order across the whole list. Also, can Haskell's lazy evaluation change the effective complexity of the composed function? Thanks in advance!

 import Data.List.Split (chunksOf)
 import Data.List (sortBy)

 rearrange :: [Int] -> [Int]
 rearrange = concat
           . sortBy (\a b -> compare (last a) (head b))
           . map (sortBy (\a b -> compare b a))
           . chunksOf 2
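For reference, here is how the pipeline behaves on small inputs (my own examples, not from the question; chunksOf is reimplemented locally so the snippet needs nothing outside base, matching Data.List.Split.chunksOf for positive chunk sizes):

```haskell
import Data.List (sortBy)

-- Local stand-in for Data.List.Split.chunksOf (valid for n > 0 only),
-- so this snippet depends only on base.
chunksOf :: Int -> [a] -> [[a]]
chunksOf _ [] = []
chunksOf n xs = take n xs : chunksOf n (drop n xs)

rearrange :: [Int] -> [Int]
rearrange = concat
          . sortBy (\a b -> compare (last a) (head b))  -- order the chunks
          . map (sortBy (\a b -> compare b a))          -- sort each chunk descending
          . chunksOf 2

main :: IO ()
main = do
  print (rearrange [1 .. 6])           -- [2,1,4,3,6,5]
  print (rearrange [3, 1, 4, 1, 5, 9]) -- [3,1,4,1,9,5]
```

Each pair ends up in descending order, and the pairs are then arranged by comparing the last element of one with the head of the next.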
2 answers

Let's look at the individual parts (let n be the length of the list argument):

  • chunksOf 2 is O(n) and produces a list of length (n+1) `quot` 2 .
  • map (sortBy ...) : Since every list passed to sortBy ... has length <= 2 , each of these sorts is O(1) , and thus the whole map is O(n) .
  • sortBy (\a b -> compare (last a) (head b)) : Each comparison is O(1) , since the lists whose last element is taken have bounded length ( <= 2 ), so the whole sortBy is O(n*log n) .
  • concat is again O(n) .

So we have O(n*log n) .

Please note, however, that

 cmp = \a b -> compare (last a) (head b) 

is not a consistent comparison: for two lists a and b (say, [30,10] and [25,15] ) you can have

 cmp a b == LT && cmp b a == LT 
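The inconsistency is easy to check directly (a minimal sketch, with the comparison written out with explicit arguments):

```haskell
-- The comparison from the answer: last element of the first chunk
-- against the head of the second.
cmp :: [Int] -> [Int] -> Ordering
cmp a b = compare (last a) (head b)

main :: IO ()
main = do
  print (cmp [30,10] [25,15])  -- LT: compares 10 with 25
  print (cmp [25,15] [30,10])  -- LT as well: compares 15 with 30
```

Both orders yield LT, so cmp is not antisymmetric: it says each chunk should come before the other.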

I'm not sure your algorithm always works.

Having looked at the implementation of sortBy and traced the sorting a bit in my head, I think it does work for this purpose (provided the list elements are distinct), and the inconsistent comparison does no harm here. For some sorting algorithms, an inconsistent comparison can make the sort loop forever, but that should not happen for merge-sort variants.


Well, step by step:

  • chunksOf 2 must traverse the entire list, so it is O(n) , producing a list half as long. Since constant factors do not affect complexity, we can ignore the halving.
  • map (sortBy ...) walks the entire list, O(n) , performing a constant-time operation* on each element: O(1) * O(n) = O(n)
  • sortBy with a constant-time comparison* is O(n * log n)
  • concat , which is O(n)

So, in total, O(n + n + n log n + n) = O((3 + log n) * n) = O(n log n)

* Since the lists have length 2 or less, operations such as sorting them and accessing the last element are O(2 * log 2) and O(2) respectively, which are constant time, i.e. O(1)
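To make the constant-time claim concrete: for chunks of length <= 2 the inner sort needs at most one comparison. A hypothetical drop-in for the map (sortBy ...) stage (my own sketch, not from the answer):

```haskell
-- Sorting a chunk of length <= 2 in descending order:
-- at most one comparison, hence O(1) per chunk.
sortPairDesc :: Ord a => [a] -> [a]
sortPairDesc [x, y]
  | x < y     = [y, x]
  | otherwise = [x, y]
sortPairDesc xs = xs  -- chunks of length 0 or 1 are already sorted

main :: IO ()
main = print (map sortPairDesc [[1,2], [4,3], [5]])  -- [[2,1],[4,3],[5]]
```

Mapping this over the chunk list is O(1) work per chunk, i.e. O(n) overall, which is exactly the bound used above.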


Source: https://habr.com/ru/post/1482830/

