Why is toInteger :: Int -> Integer lazy?

I have the following code:

    {-# NOINLINE i2i #-}
    i2i :: Int -> Integer
    i2i x = toInteger x

    main = print $ i2i 2

Running GHC with the -ddump-simpl flag gives:

    [Arity 1 NoCafRefs Str: DmdType U(L)]
    Main.i2i = GHC.Real.toInteger1

Converting from Int to Integer appears to be lazy. Why is this so? Is there a case where I can have

    (toInteger (_|_ :: Int)) /= _|_

?

Edit: The question is more about the GHC strictness analyzer than about laziness per se. This code came from studying the standard mean function:

    --mean :: Integer -> Integer -> [Integer] -> Double
    mean :: Integer -> Int -> [Integer] -> Double
    mean acc n [] = fromIntegral acc / fromIntegral n
    mean acc n (x:xs) = mean (acc + x) (n + 1) xs

    main = print $ mean 0 0 [1..1000000]

This code runs in O(N) space. When I uncomment the first line, the space consumption drops to O(1). It seems to come down to the fromIntegral call, which in turn comes down to toInteger. The strictness analyzer somehow cannot conclude that the conversion is strict, which seems strange to me.
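For reference, fromIntegral in the Prelude is simply the composition of fromInteger and toInteger, which is why the question reduces to the strictness of toInteger:

    -- Standard Prelude definition, reproduced here for reference
    -- (the name myFromIntegral is used only to avoid clashing with the Prelude):
    myFromIntegral :: (Integral a, Num b) => a -> b
    myFromIntegral = fromInteger . toInteger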

2 answers

Answer to your edit: the dangers of O(N) space leaks from accumulating parameters are well known, at least to Haskell programmers. What ought to be well known, but isn't, is that no matter what the language, you should never trust the optimizer to provide asymptotic guarantees about the space and time behavior of your programs. I don't fully understand the implications of simple optimizers I have written myself, let alone something as hairy as GHC's front end, with a strictness analyzer, an inliner, and all the rest.

Regarding your two questions,

Why doesn't GHC's strictness analyzer optimize this code, when it optimizes very similar code?

Who knows? (Perhaps Simon PJ knows; perhaps not.) If you care about performance, you should not rely on the strictness analyzer. That brings us to the second, implied question:

How can I avoid O(N) space overhead for this function, and for every other function that uses accumulating parameters?

By putting strictness annotations on the accumulating parameters, which forces them to be evaluated at each tail-recursive call. A sketch of what that looks like is below.
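For example, here is a minimal sketch of the mean function from the question with bang patterns on the accumulators (bang patterns are one way to add the annotations; seq would also work):

    {-# LANGUAGE BangPatterns #-}

    -- Bang patterns force acc and n at every tail-recursive call,
    -- so no chain of thunks builds up and the loop runs in O(1) space.
    mean :: Integer -> Int -> [Integer] -> Double
    mean !acc !n []     = fromIntegral acc / fromIntegral n
    mean !acc !n (x:xs) = mean (acc + x) (n + 1) xs

    main :: IO ()
    main = print $ mean 0 0 [1..1000000]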


I think you are looking at it the wrong way. Consider the following, admittedly silly, piece of code:

    let x = [undefined]
    let y = map toInteger x

If we evaluate

  y == [] 

we get False, whereas if we evaluate

  head y 

we get an undefined exception. There is no reason why applying map, or comparing y with [], should diverge just because the only element of x is undefined. This is the essence of laziness.
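A self-contained version of this demonstration (my own sketch, not from the original answer) can be run directly:

    import Control.Exception (SomeException, evaluate, try)

    -- The spine of y is defined even though its single element is not,
    -- so the comparison with [] succeeds while forcing head y throws.
    main :: IO ()
    main = do
      let x = [undefined] :: [Int]
          y = map toInteger x
      print (y == [])  -- prints False without ever forcing the element
      r <- try (evaluate (head y)) :: IO (Either SomeException Integer)
      case r of
        Left _  -> putStrLn "head y is bottom"
        Right v -> print v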


Source: https://habr.com/ru/post/1308543/
