In Haskell, does the Integral typeclass imply the Show typeclass?

I tried to compile this code.

symmetric [] = True
symmetric [_] = True
symmetric l
    | (head l) == (last l) = symmetric (tail (init l))
    | otherwise = False

isPalindrome :: Integral a => a -> Bool
isPalindrome n = symmetric (show n)

This code did not compile, and I got a not-so-long error message saying it could not deduce (Show a):

Could not deduce (Show a) arising from a use of ‘show’
  from the context (Integral a)
  bound by the type signature for
             isPalindrome :: Integral a => a -> Bool
  at 4.hs:7:17-39
Possible fix:
  add (Show a) to the context of
    the type signature for isPalindrome :: Integral a => a -> Bool
In the first argument of ‘symmetric’, namely ‘(show n)’
In the expression: symmetric (show n)
In an equation for ‘isPalindrome’:
    isPalindrome n = symmetric (show n)

It worked after changing this line

isPalindrome :: Integral a => a -> Bool

to

isPalindrome :: (Show a, Integral a) => a -> Bool
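
With both constraints in the context the file loads fine; a quick GHCi check (example numbers chosen just for illustration, not from the original code):

ghci> isPalindrome 12321
True
ghci> isPalindrome (1234 :: Int)
False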

So I thought that, since every Integral type is also in Show, the Haskell compiler should be able to deduce (Show a) from (Integral a).

1 answer

So I thought that, since every Integral type is also in Show

But not every Integral type is in Show. That used to be the case in Haskell 98, because of

class (Eq a, Show a) => Num a

(a superclass constraint that was widely regarded as a misfeature and has since been dropped). In current Haskell, Show is not a superclass of Num or Integral, so the compiler cannot deduce (Show a) from (Integral a); you have to add the constraint yourself, as you did.
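
For concreteness, this is roughly how the class head changed between the Haskell 98 Report and today's base (paraphrased from memory, so treat it as a sketch; the methods are the same, only the superclasses differ):

-- Haskell 98: every Num instance also had to be Eq and Show
class (Eq a, Show a) => Num a where
    (+), (-), (*)       :: a -> a -> a
    negate, abs, signum :: a -> a
    fromInteger         :: Integer -> a

-- Current base: no superclass constraints at all
class Num a where
    (+), (-), (*)       :: a -> a -> a
    negate, abs, signum :: a -> a
    fromInteger         :: Integer -> a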

Alternatively, you can avoid the Show constraint altogether: showInt from Numeric needs only Integral.

import Numeric (showInt)

-- showInt n :: ShowS (String -> String); applying it to [] yields the digit string
isPalindrome :: Integral a => a -> Bool
isPalindrome n = symmetric $ showInt n []
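
One caveat worth adding: Numeric.showInt is documented for non-negative numbers only, so unlike show it will not render a minus sign. A small sketch exercising this version (example values chosen for illustration; negative inputs deliberately avoided):

main :: IO ()
main = do
    print (isPalindrome (4994 :: Integer))  -- True: the digits read the same both ways
    print (isPalindrome (123  :: Int))      -- False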

Source: https://habr.com/ru/post/1668100/

