Unification of Associated Type Synonyms with Class Constraints

I have a nested type that I want to partially specify using associated type synonyms. Below is some heavily abbreviated code that demonstrates the problem:

    {-# LANGUAGE TypeFamilies, FlexibleInstances, FlexibleContexts,
                 MultiParamTypeClasses #-}

    f :: Int -> Int
    f x = x + 1

    data X t i
    newtype Z i = Z i deriving (Eq, Show)
    newtype Y q = Y (T q)

    class Qux m r where
      g :: T m -> r

    class (Integral (T a)) => Foo a where
      type T a
      -- ... functions

    instance (Integral i) => Foo (X (Z i) i) where
      type T (X (Z i) i) = i

    instance (Foo q) => Num (Y q) -- where ...

    instance (Foo q, Foo (X m' Int)) => Qux (X m' Int) (Y q) where
      g x = fromIntegral $ f $ x

which (even with UndecidableInstances) leads to a compiler error:

 Could not deduce (T (X m' Int) ~ Int) 

I know that adding this constraint to the Qux instance makes the compiler happy. However, I also know that in my program T (X arg1 arg2) = arg2 always holds, so I'm trying to figure out how to avoid writing this constraint.
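For reference, here is the workaround I'd like to avoid, as a minimal sketch; the instance context is my reconstruction of the fix the error message suggests:

    -- Spelling out the equality constraint by hand; this is what
    -- I am hoping not to have to write.
    instance (Foo q, Foo (X m' Int), T (X m' Int) ~ Int)
          => Qux (X m' Int) (Y q) where
      g x = fromIntegral $ f $ x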

My question is: can I make Haskell understand that when I write Int as the second parameter of X, it is identical to the synonym T (X a' Int)? I realize that I am using "special" knowledge of how my instances will look, but I think there should be a way to communicate this to the compiler.

thanks

1 answer

Since I'm not sure I've understood the question, I'm going to discuss the code you wrote; perhaps some part of my incoherent rambling will either point you in a useful direction or prompt some sharper questions that you can ask me. That is: warning! rambling non-answer ahead!

First, let's talk about the Bar class.

    -- class (i ~ S b) => Bar b i where
    --   type S b
    --   ...

Since we know the constraint i ~ S b, we may as well drop the argument i, and I will do so for the rest of the discussion.

    class Bar b where
      type S b

    -- or, if the class is empty, just
    -- type family S b
    -- with no class declaration at all

Here's what the instances for this new class look like:

    instance Bar (Z i) where
      type S (Z i) = i

    instance Bar (Z' i) where
      type S (Z' i) = i

If this is intended to hold for any type constructor, you might consider writing it as a single instance instead:

    -- instance Bar (f i) where
    --   type S (f i) = i

Now let's talk about the Foo class. Adjusting it to match the above, we write

    class Bar (T a) => Foo a where
      type T a

You declared two instances:

    -- instance (Bar (Z i) i) => Foo (X (Z i) i) where
    --   type T (X (Z i) i) = Z i
    --
    -- instance (Bar (Z' i) i) => Foo (X' (Z' i) i) where
    --   type T (X' (Z' i) i) = Z' i

We can drop the second argument to Bar, as before; but we can do something else, too: since we know there is an instance Bar (Z i) (we declared it above!), we can drop the instance constraint entirely.

    instance Foo (X (Z i) i) where
      type T (X (Z i) i) = Z i

    instance Foo (X (Z' i) i) where
      type T (X (Z' i) i) = Z' i

Whether you want to keep the argument i to X depends on what X is supposed to represent. So far, we have not changed the semantics of any class declarations or data types, only how they are declared and instantiated. Changing X might not have that property; it's hard to say without seeing the definition of X. With suitable data declarations and enough extensions, the prelude above compiles.
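For concreteness, here is one way the pieces above fit together into a compiling module. This is only a sketch against modern GHC: the data declarations for X, Z, and Z' are my guesses, since they never appeared in the question.

    {-# LANGUAGE TypeFamilies, FlexibleInstances, FlexibleContexts,
                 UndecidableSuperClasses #-}
    -- UndecidableSuperClasses is needed on modern GHC because the
    -- superclass constraint Bar (T a) mentions a type family.

    -- Guessed data declarations; the question never showed these.
    data X t i
    newtype Z  i = Z  i deriving (Eq, Show)
    newtype Z' i = Z' i deriving (Eq, Show)

    class Bar b where
      type S b

    instance Bar (Z i)  where type S (Z i)  = i
    instance Bar (Z' i) where type S (Z' i) = i

    class Bar (T a) => Foo a where
      type T a

    instance Foo (X (Z i) i)  where type T (X (Z i) i)  = Z i
    instance Foo (X (Z' i) i) where type T (X (Z' i) i) = Z' i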

Now, your complaints:

  • You say the following does not compile:

        class Qux a

        instance Foo (X a' Int) => Qux (X a' Int)
        instance Foo (X' a' Int) => Qux (X' a' Int)

    But with UndecidableInstances, it compiles here. And it makes sense that it needs UndecidableInstances: there is nothing stopping somebody from coming along later and declaring an instance like

        instance Qux (X Y Int) => Foo (X Y Int)

    Then checking whether `Qux (X Y Int)` has an instance would require checking whether `Foo (X Y Int)` has an instance, and vice versa. Loop.
  • You say: "It also wants an instance constraint S (T (X a')) ~ Int, although I know that in my program these are always just synonyms." I don't know what the first "it" is; I cannot reproduce this error, so I cannot respond to it very well. Furthermore, as I complained before, this constraint is not even well-kinded: X :: (* -> *) -> * -> *, so X a' :: * -> *, while T expects an argument of kind *. So I'm going to assume you mean S (T (X a' Int)) ~ Int.

    Despite these complaints, we can ask why S (T (X a' Int)) ~ Int is not provable from the assumptions we have so far. So far, we have only declared Foo instances for types matching the patterns X (Z i) i and X (Z' i) i. Consequently, the type function T can only reduce when its argument matches one of those patterns. The type X a' Int does not match either pattern. We could nudge it into the right shape: we could instead try to reduce, say, X (Z Int) Int. Then we find that T (X (Z Int) Int) ~ Z Int and hence S (T (X (Z Int) Int)) ~ S (Z Int) ~ Int.

    This explains how to fix the reduction at the type level, but not how to fix whatever code of yours is not building. To do that, we would need to find out why the type-checker wants a coercion from S (T (X a' Int)) to Int, and see whether we can convince it to want a more specific (and satisfiable) coercion like S (T (X (Z Int) Int)) ~ Int or, with the generalized Bar instance proposed above, S (T (X (f Int) Int)) ~ Int. Of course, we cannot help with that without enough code to reproduce your error. (A compilable sketch of this reduction appears after this list.)

  • You ask: "Can I make Haskell realize that when I write Int as the second parameter of X, it is the same as the synonym S (T (X a' Int))?" I don't understand this question at all. Do you want to somehow have the provable equality X a Int ~ S (T (X a' Int))? That is what I get from reading your question literally.

    In context, I think you may instead want to ask whether you can get the provable equality b ~ S (T (X a b)); in that case, the answer is "Definitely!". We will abuse the fact, known from above, that b ~ S (Z b) to reduce this equation to the stronger Z b ~ T (X a b). Then we can simply replace the Foo instances above with a single one that makes exactly this declaration and nothing more:

        instance Foo (X a b) where
          type T (X a b) = Z b

    Alternatively, we could posit another reasonable equation, T (X a b) ~ a; then, to prove that S (T (X a b)) ~ b, we just need to prove that S a ~ b (by reducing T), so we could write this other alternative Foo instance:

        instance (Bar a, S a ~ b) => Foo (X a b) where
          type T (X a b) = a
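To see the suggested reduction go through end to end, here is a minimal compilable sketch. The function name check is made up for illustration; its signature forces GHC to prove S (T (X () Int)) ~ Int, which now follows by pure reduction, with no extra equality constraints in sight.

    {-# LANGUAGE TypeFamilies #-}

    data X a b
    newtype Z b = Z b

    -- Simplified versions of the classes above; the Bar (T a)
    -- superclass is omitted to keep the sketch minimal.
    class Bar b where
      type S b

    instance Bar (Z i) where
      type S (Z i) = i

    class Foo a where
      type T a

    -- The single generalized instance suggested above.
    instance Foo (X a b) where
      type T (X a b) = Z b

    -- Hypothetical: the signature demands S (T (X () Int)) ~ Int.
    -- GHC reduces T (X () Int) to Z Int, then S (Z Int) to Int.
    check :: S (T (X () Int)) -> Int
    check = id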

Source: https://habr.com/ru/post/988359/

