I'm still trying to wrap my head around how F# generalizes (or doesn't generalize) functions and types, and there is a case that puzzles me:
let min(a, b) = if a < b then a else b
let add(a, b) = a + b

let minInt = min(3, 4)
let minFloat = min(3.0, 4.0) // works!

let addInt = add(3, 5)
let addFloat = add(3.0, 5.0)
// error: This expression was expected to have type
//     int
// but here has type
//     float
Here min has the general type 'a * 'a -> 'a (requires comparison), while add has the specific type int * int -> int, apparently derived from its first use in the program. Both are declared and used in the same way, so why the difference in generalization?
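For illustration (this annotated variant is my own, not part of the original snippet): an explicit type annotation shows that int * int -> int is just how (+) gets resolved in the absence of other information, not a fixed requirement of the function body:

```fsharp
// Hypothetical variant: with an annotation, the same body compiles for floats,
// so int * int -> int was only the default resolution of (+).
let addFloat' (a: float) (b: float) = a + b // float -> float -> float
```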
I understand that in the add case the problem can be avoided by declaring the function inline, which forces it to be given a general (statically resolved) type, i.e. ^a * ^b -> ^c (requires member (+)), but that does not explain why this is necessary in one case and not in the other.
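A minimal sketch of that inline workaround (my own, assuming the same add signature as above):

```fsharp
// Marking add inline gives it a statically resolved generic type,
// so both the int and float call sites compile.
let inline add (a, b) = a + b

let addInt = add (3, 5)       // int
let addFloat = add (3.0, 5.0) // float
```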
Asik