I get type errors because GHC will not infer the constraint variable; it looks like a limitation stemming from first-order unification. Experimenting further, I found that inserting let-bindings changes the behavior of type inference. I would like to know what GHC is actually doing.
The code is shown below. The newtype ConstrainedF c denotes a polymorphic function whose type parameter is restricted by the constraint c. As far as I can tell, GHC will not infer c from the value passed to ConstrainedF.
{-# LANGUAGE RankNTypes, ScopedTypeVariables, ConstraintKinds, KindSignatures, MonoLocalBinds #-}
import Data.Monoid
import Data.Kind (Constraint)
newtype ConstrainedF c =
  ConstrainedF { runConstrainedF :: forall a. c a => [a] -> a }
applyConstrainedF :: forall c a. c a => ConstrainedF c -> [a] -> a
applyConstrainedF f xs = runConstrainedF f xs
-- GHC cannot infer the type parameter of ConstrainedF
foo :: [Int]
foo = applyConstrainedF (ConstrainedF mconcat) [[1], [2]]
--foo = applyConstrainedF (ConstrainedF mconcat :: ConstrainedF Monoid) [[1], [2]]
It should be possible to infer the types in the application ConstrainedF mconcat:

- ConstrainedF has type forall c. (forall a. c a => [a] -> a) -> ConstrainedF c.
- mconcat has type forall b. Monoid b => [b] -> b.
- Unifying forall b. Monoid b => [b] -> b with the expected argument type forall a. c a => [a] -> a gives a := b and c := Monoid.

However, GHC complains:
Could not deduce (Monoid a) arising from a use of `mconcat'
from the context (c0 a).
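For reference, everything compiles as soon as c is pinned down by hand, either via the annotation in the commented-out line above or via a visible type application. A minimal sketch of the latter (it assumes the TypeApplications extension, which is not in my pragma list, and fooExplicit is just a name for this experiment):
-- c is fixed to Monoid explicitly, so mconcat is checked against
-- forall a. Monoid a => [a] -> a and type checking succeeds
fooExplicit :: [Int]
fooExplicit = applyConstrainedF @Monoid (ConstrainedF mconcat) [[1], [2]]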
Why can't GHC infer that c must be Monoid here?
There is a workaround, though: if I fix c by passing an extra argument that mentions it, everything compiles:
data Cst1 (c :: * -> Constraint) = Cst1
monoid :: Cst1 Monoid
monoid = Cst1
applyConstrainedF :: forall c a. c a => ConstrainedF c -> Cst1 c -> [a] -> a
applyConstrainedF f _ xs = runConstrainedF f xs
foo :: [Int]
foo = applyConstrainedF (ConstrainedF mconcat) monoid [[1], [2]]
However, if I first bind ConstrainedF mconcat with a let, GHC once again cannot infer that c is Monoid:
foo_doesn't_work :: [Int]
foo_doesn't_work = let cf = ConstrainedF mconcat
                   in applyConstrainedF cf monoid [[1], [2]]
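Interestingly, the let version compiles again as soon as the bound variable gets an explicit type signature (foo_with_sig is just a name I made up for this experiment):
foo_with_sig :: [Int]
foo_with_sig = let cf :: ConstrainedF Monoid
                   -- the signature fixes c to Monoid before cf is used
                   cf = ConstrainedF mconcat
               in applyConstrainedF cf monoid [[1], [2]]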
So what exactly is GHC doing in these two cases, and why does introducing the let binding break the inference of c? An explanation would be appreciated.