This is due to how ++ works: it is defined by recursion on its first argument.
    [] ++ ys     = ys
    (x:xs) ++ ys = x : (xs ++ ys)
The number of recursive calls depends only on the length of xs.
Because of this, in xs ++ ys it is much cheaper to have a small xs and a large ys than the other way around.
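As a concrete illustration (a worked trace of my own, with made-up example values), here is how the definition unfolds on a two-element left operand; note that the right operand is never traversed:

    -- A worked reduction of (++), following the two equations above:
    --
    --   [1,2] ++ [10,20]
    --   = 1 : ([2] ++ [10,20])      -- (x:xs) ++ ys = x : (xs ++ ys)
    --   = 1 : (2 : ([] ++ [10,20])) -- same equation again
    --   = 1 : (2 : [10,20])         -- [] ++ ys = ys
    --   = [1,2,10,20]
    --
    -- Two recursive steps for a left operand of length 2; the right
    -- operand is only passed along, so the cost is O(length xs).
    example :: [Int]
    example = [1,2] ++ [10,20]       -- evaluates to [1,2,10,20]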
Note that associating to the right (as foldr does) achieves exactly this:
[1] ++ ([2] ++ ([3] ++ ...
Each ++ has only a short (O(1)-length) list to its left, which makes the cost of the whole concatenation O(n).
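In code, this right-nested shape is exactly what a right fold builds; here is a minimal sketch (the name concatRight is mine, not from the answer, and it behaves like the standard concat):

    -- Right-associated concatenation: [1] ++ ([2] ++ ([3] ++ ...)).
    -- Each (++) only copies its O(1)-length left operand, so gluing
    -- together n singleton lists costs O(n) in total.
    concatRight :: [[a]] -> [a]
    concatRight = foldr (++) []

    -- Example: concatRight [[1],[2],[3]] == [1,2,3]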
Folding to the left, by contrast, is bad:
((([] ++ [1]) ++ [2]) ++ [3]) ++ ...
The left arguments keep getting larger and larger, so here we get O(n^2) complexity.
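For contrast, a left fold builds exactly the nesting shown above; a sketch with a hypothetical name concatLeft:

    -- Left-associated concatenation: ((([] ++ [1]) ++ [2]) ++ [3]) ++ ...
    -- The k-th (++) has to copy an accumulator that is already k-1
    -- elements long, so the total work is 0 + 1 + ... + (n-1) = O(n^2).
    concatLeft :: [[a]] -> [a]
    concatLeft = foldl (++) []

    -- e.g. length (concatLeft (replicate 10000 [0])) is noticeably
    -- slower than the foldr version on the same input.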
This argument can be refined by considering how much of the output list is actually demanded. Notice that foldr yields its result in a "streaming" fashion: forcing, say, the first cell of the output list forces only a small part of the input. Only the first ++ needs to be evaluated, and in fact only its first recursive step! By contrast, forcing even just the first element of the result of foldl forces all the ++ calls, which makes it quite expensive (even if each call needs only one recursive step, there are O(n) of them).
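This streaming behaviour is easy to observe directly; a small illustration of my own (the names are made up), using an infinite supply of input lists:

    -- foldr streams: taking a prefix of the result forces only the first
    -- few (++) calls, so this terminates even with infinitely many lists.
    streamed :: [Int]
    streamed = take 3 (foldr (++) [] [[x] | x <- [1 ..]])   -- [1,2,3]

    -- foldl must walk the entire input list before producing anything,
    -- so the analogous expression never terminates:
    -- diverges = take 3 (foldl (++) [] [[x] | x <- [1 ..]])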