I'm puzzled by how the Haskell compiler sometimes infers types that are less polymorphic than what I'd expect, for example when using point-free definitions.
It seems like the issue is the "monomorphism restriction", which is on by default on older versions of the compiler.
Consider the following haskell program:
{-# LANGUAGE MonomorphismRestriction #-}
import Data.List(sortBy)
plus = (+)
plus' x = (+ x)
sort = sortBy compare
main = do
print $ plus' 1.0 2.0
print $ plus 1.0 2.0
print $ sort [3, 1, 2]
If I compile this with `ghc` I obtain no errors and the output of the executable is:
3.0
3.0
[1,2,3]
If I change the `main` body to:
main = do
print $ plus' 1.0 2.0
print $ plus (1 :: Int) 2
print $ sort [3, 1, 2]
I get no compile time errors and the output becomes:
3.0
3
[1,2,3]
as expected. However if I try to change it to:
main = do
print $ plus' 1.0 2.0
print $ plus (1 :: Int) 2
print $ plus 1.0 2.0
print $ sort [3, 1, 2]
I get a type error:
test.hs:13:16:
No instance for (Fractional Int) arising from the literal ‘1.0’
In the first argument of ‘plus’, namely ‘1.0’
In the second argument of ‘($)’, namely ‘plus 1.0 2.0’
In a stmt of a 'do' block: print $ plus 1.0 2.0
The same happens when trying to call `sort` twice with different types:
main = do
print $ plus' 1.0 2.0
print $ plus 1.0 2.0
print $ sort [3, 1, 2]
print $ sort "cba"
produces the following error:
test.hs:14:17:
No instance for (Num Char) arising from the literal ‘3’
In the expression: 3
In the first argument of ‘sort’, namely ‘[3, 1, 2]’
In the second argument of ‘($)’, namely ‘sort [3, 1, 2]’
- Why does `ghc` suddenly think that `plus` isn't polymorphic and requires an `Int` argument? The only reference to `Int` is in an application of `plus`; how can that matter when the definition is clearly polymorphic?
- Why does `ghc` suddenly think that `sort` requires a `Num Char` instance?
Moreover if I try to place the function definitions into their own module, as in:
{-# LANGUAGE MonomorphismRestriction #-}
module TestMono where
import Data.List(sortBy)
plus = (+)
plus' x = (+ x)
sort = sortBy compare
I get the following error when compiling:
TestMono.hs:10:15:
No instance for (Ord a0) arising from a use of ‘compare’
The type variable ‘a0’ is ambiguous
Relevant bindings include
sort :: [a0] -> [a0] (bound at TestMono.hs:10:1)
Note: there are several potential instances:
instance Integral a => Ord (GHC.Real.Ratio a)
-- Defined in ‘GHC.Real’
instance Ord () -- Defined in ‘GHC.Classes’
instance (Ord a, Ord b) => Ord (a, b) -- Defined in ‘GHC.Classes’
...plus 23 others
In the first argument of ‘sortBy’, namely ‘compare’
In the expression: sortBy compare
In an equation for ‘sort’: sort = sortBy compare
- Why isn't `ghc` able to use the polymorphic type `Ord a => [a] -> [a]` for `sort`?
- And why does `ghc` treat `plus` and `plus'` differently? `plus` should have the polymorphic type `Num a => a -> a -> a`, and I don't really see how this is different from the type of `sort`, and yet only `sort` raises an error.
Last thing: if I comment out the definition of `sort`, the file compiles. However, if I try to load it into `ghci` and check the types I get:
*TestMono> :t plus
plus :: Integer -> Integer -> Integer
*TestMono> :t plus'
plus' :: Num a => a -> a -> a
Why isn't the type for `plus` polymorphic?
This is the canonical question about the monomorphism restriction in Haskell, as discussed in the meta question.
What is the monomorphism restriction?
The monomorphism restriction, as described by the Haskell wiki, is a counter-intuitive rule in Haskell type inference: if you forget to provide a type signature, sometimes this rule will fill the free type variables with specific types using "type defaulting" rules.
What this means is that, in some circumstances, if your type is ambiguous (i.e. polymorphic) the compiler will choose to instantiate that type to something not ambiguous.
How do I fix it?
First of all you can always explicitly provide a type signature, and this will avoid triggering the restriction (only the bindings you care about need a signature; it's not necessary to annotate everything). For the definitions from the question that would look like this:
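```haskell
import Data.List (sortBy)

plus :: Num a => a -> a -> a
plus = (+)

plus' :: Num a => a -> a -> a
plus' x = (+ x)

sort :: Ord a => [a] -> [a]
sort = sortBy compare
```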
Alternatively, if you are defining a function, you can avoid point-free style, and for example write:
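```haskell
plus a b = a + b             -- instead of the point-free  plus = (+)
sort xs = sortBy compare xs  -- instead of                 sort = sortBy compare
```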
Turning it off
It is possible to simply turn off the restriction so that you don't have to do anything to your code to fix it. The behaviour is controlled by two extensions: `MonomorphismRestriction` will enable it (which is the default) while `NoMonomorphismRestriction` will disable it.
You can put the following line at the very top of your file:
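```haskell
{-# LANGUAGE NoMonomorphismRestriction #-}
```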
If you are using GHCi you can enable the extension using the `:set` command:
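```
:set -XNoMonomorphismRestriction
```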
You can also tell `ghc` to enable the extension from the command line:
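```
ghc -XNoMonomorphismRestriction MyFile.hs
```
(here `MyFile.hs` is just a placeholder for your source file)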
Note: You should really prefer the first option over choosing the extension via command-line options.
Refer to GHC's page for an explanation of this and other extensions.
A complete explanation
I'll try to summarize below everything you need to know to understand what the monomorphism restriction is, why it was introduced and how it behaves.
An example
Take the following trivial definition:
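```haskell
plus = (+)
```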
you'd think you'd be able to replace every occurrence of `+` with `plus`. In particular, since `(+) :: Num a => a -> a -> a`, you'd expect to also have `plus :: Num a => a -> a -> a`.
Unfortunately this is not the case. For example, if we try the following in GHCi:
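```haskell
ghci> let plus = (+)
ghci> plus 1.0 1
```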
We get an error roughly like the following (the exact message depends on the GHC version):
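```
<interactive>:4:6:
    No instance for (Fractional Integer) arising from the literal ‘1.0’
    In the first argument of ‘plus’, namely ‘1.0’
    In the expression: plus 1.0 1
```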
You may need to `:set -XMonomorphismRestriction` in newer GHCi versions.
And in fact we can see that the type of `plus` is not what we would expect:
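```haskell
ghci> :t plus
plus :: Integer -> Integer -> Integer
```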
What happened is that the compiler saw that `plus` had type `Num a => a -> a -> a`, a polymorphic type. Moreover the above definition falls under the rules that I'll explain later, and so the compiler decided to make the type monomorphic by defaulting the type variable `a`. The default is `Integer`, as we can see.
Note that if you try to compile the above code using `ghc` you won't get any errors. This is due to how `ghci` handles (and must handle) interactive definitions. Basically every statement entered in `ghci` must be completely type checked before the following one is considered; in other words it's as if every statement were in a separate module. Later I'll explain why this matters.
Some other example
Consider a set of definitions along these lines (only `f2` and `f4` are simple pattern bindings without a type signature):
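```haskell
f1 x = show x          -- function binding

f2 = \x -> show x      -- simple pattern binding, no signature

f3 :: Show a => a -> String
f3 = \x -> show x      -- simple pattern binding, with a signature

f4 = show              -- simple pattern binding, no signature

f5 :: Show a => a -> String
f5 = show              -- simple pattern binding, with a signature
```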
We'd expect all these functions to behave in the same way and have the same type, i.e. the type of `show`: `Show a => a -> String`.
Yet when compiling the above definitions with `ghc` we obtain errors for `f2` and `f4`, much like the `sort` error from the question: the type variable is ambiguous and its `Show` constraint cannot be defaulted. So `f2` and `f4` don't compile. Moreover, when trying to define these functions in GHCi we get no errors, but the type for `f2` and `f4` is `() -> String`!
The monomorphism restriction is what makes `f2` and `f4` require a monomorphic type, and the different behaviour between `ghc` and `ghci` is due to different defaulting rules.
When does it happen?
In Haskell, as defined by the report, there are two distinct kinds of bindings: function bindings and pattern bindings. A function binding is nothing other than a definition of a function:
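```haskell
f x = x + 1
```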
Note that their syntax is:
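```
<identifier> arg1 arg2 ... argn = expr
```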
where there must be at least one argument (modulo guards and `where` declarations, but they don't really matter here).
A pattern binding is a declaration of the form:
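```
<pattern> = expr
```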
Again, modulo guards.
Note that variables are patterns, so the binding:
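```haskell
plus = (+)
```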
is a pattern binding: it's binding the pattern `plus` (a variable) to the expression `(+)`.
When a pattern binding consists of only a variable name it's called a simple pattern binding.
The monomorphism restriction applies to simple pattern bindings!
Well, formally, the report (Section 4.5.1) defines a declaration group as a minimal set of mutually dependent bindings.
And then (Section 4.5.5 of the report) a declaration group is unrestricted if and only if:

- every variable in the group is bound by a function binding (e.g. `f x = x`) or a simple pattern binding (e.g. `plus = (+)`), and
- an explicit type signature is given for every variable in the group that is bound by a simple pattern binding (e.g. `plus :: Num a => a -> a -> a; plus = (+)`).

Examples added by me.
So a restricted declaration group is a group where either there are non-simple pattern bindings (e.g. `(x:xs) = f something` or `(f, g) = ((+), (-))`) or there is some simple pattern binding without a type signature (as in `plus = (+)`).
The monomorphism restriction affects restricted declaration groups.
Most of the time you don't define mutually recursive functions, and hence a declaration group is just a single binding.
What does it do?
The monomorphism restriction is described by two rules in Section 4.5.5 of the report.
First rule
The rule says that the constrained type variables of a restricted declaration group may not be generalized. In other words: if the type is polymorphic (i.e. it contains some type variable) and that type variable is constrained (i.e. it has a class constraint on it; e.g. the type `Num a => a -> a -> a` is polymorphic because it contains `a`, and also constrained because `a` has the `Num` constraint on it), then it cannot be generalized.
In simple words, not generalizing means that the uses of the function `plus` may change its type.
If you had definitions along these lines:
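```haskell
plus = (+)

x :: Integer
x = plus 1 2

y :: Double
y = plus 1.0 2
```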
then you'd get a type error, because when the compiler sees that `plus` is called over an `Integer` in the declaration of `x`, it will unify the type variable `a` with `Integer`, and hence the type of `plus` becomes:
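```haskell
plus :: Integer -> Integer -> Integer
```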
but then, when it type checks the definition of `y`, it will see that `plus` is applied to a `Double` argument, and the types don't match.
Note that you can still use `plus` without getting an error:
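```haskell
plus = (+)

x :: Double
x = plus 1.0 2
```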
In this case the type of `plus` is first inferred to be `Num a => a -> a -> a`, but then its use in the definition of `x`, where `1.0` requires a `Fractional` constraint, will change it to `Fractional a => a -> a -> a`.
Rationale
The report says that Rule 1 is required for two reasons: it prevents computations from being unexpectedly repeated, and it prevents ambiguity.
For the first point the example from the wiki is, I believe, clearer. Consider the function:
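```haskell
import Data.List (genericLength)  -- genericLength :: Num i => [a] -> i

f xs = (len, len)
  where len = genericLength xs
```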
If `len` were polymorphic the type of `f` would be:
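```haskell
f :: (Num a, Num b) => [c] -> (a, b)
```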
So the two elements of the tuple `(len, len)` could actually be different values! But this means that the computation done by `genericLength` must be repeated to obtain the two different values.
The rationale here is: the code contains one function call, but not introducing this rule could produce two hidden function calls, which is counter-intuitive.
With the monomorphism restriction the type of `f` becomes:
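```haskell
f :: Num a => [b] -> (a, a)
```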
In this way there is no need to perform the computation multiple times.
There are also situations where not applying the rule results in type ambiguity, and I believe the report's example of this is self-explanatory. It uses `reads` in a non-simple pattern binding; here it is as a small self-contained sketch (the string `t` and the `main` that pins down the type are my additions):
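```haskell
t :: String
t = "42 rest"

-- reads :: Read a => String -> [(a, String)]
-- This is a non-simple pattern binding, so Rule 1 applies: the constrained
-- type variable of the result is not generalized. Without Rule 1, s would
-- get the type  forall a. Read a => String,  which is inherently ambiguous:
-- nothing about a String can ever tell the compiler which Read instance
-- was used to produce it.
[(n, s)] = reads t

main :: IO ()
main = print (n :: Int, s)  -- this use is what finally fixes n's type
```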
If you disable the extension as suggested above you will get a type error when trying to compile the above declaration. However this isn't really a problem: you already know that when using `read` you have to somehow tell the compiler which type it should try to parse...
Second rule
The second rule says that any monomorphic type variables that remain when type inference for an entire module is complete are considered ambiguous, and are resolved to particular types using the defaulting rules (Section 4.3.4 of the report).
This means that if you have your usual definition:
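```haskell
plus = (+)
```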
This will have a type `Num a => a -> a -> a`, where `a` is a monomorphic type variable due to rule 1 described above. Once the whole module is inferred, the compiler will simply choose a type to replace that `a` according to the defaulting rules.
The final result is: `plus :: Integer -> Integer -> Integer`.
Note that this is done after the whole module is inferred.
This means that if you have the following declarations:
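```haskell
plus = (+)

x = plus 1.0 2
```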
inside a module, before type defaulting the type of `plus` will be `Fractional a => a -> a -> a` (see rule 1 for why this happens). At this point, following the defaulting rules, `a` will be replaced by `Double`, and so we will have `plus :: Double -> Double -> Double` and `x :: Double`.
Defaulting
As stated before there exist some defaulting rules, described in Section 4.3.4 of the Report, that the inferencer can adopt and that will replace a polymorphic type with a monomorphic one. This happens whenever a type is ambiguous.
For example, in an expression along these lines:
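```haskell
let x = read "1" in show x
```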
here the expression is ambiguous because the types for `show` and `read` are:
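```haskell
show :: Show a => a -> String
read :: Read a => String -> a
```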
So `x` has type `Read a => a`. But this constraint is satisfied by a lot of types: `Int`, `Double` or `()` for example. Which one to choose? There's nothing that can tell us.
In this case we can resolve the ambiguity by telling the compiler which type we want, adding a type signature:
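```haskell
let x = read "1" :: Int in show x
```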
Now the problem is: since Haskell uses the `Num` type class to handle numbers, there are a lot of cases where numerical expressions contain ambiguities.
Consider:
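```haskell
show 1
```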
What should the result be?
As before `1` has type `Num a => a` and there are many types of numbers that could be used. Which one to choose?
Having a compiler error almost every time we use a number isn't a good thing, and hence the defaulting rules were introduced. The rules can be controlled using a `default` declaration. By specifying `default (T1, T2, T3)` we can change how the inferencer defaults the different types.
An ambiguous type variable `v` is defaultable if:

- `v` appears only in constraints of the kind `C v` where `C` is a class (i.e. if it appears as in `Monad (m v)` then it is not defaultable),
- at least one of these classes is `Num` or a subclass of `Num`, and
- all of these classes are defined in the Prelude or a standard library.

A defaultable type variable is replaced by the first type in the `default` list that is an instance of all the ambiguous variable's classes.
The default `default` declaration is `default (Integer, Double)`.
For example, take definitions along these lines:
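```haskell
plus = (+)
minus = (-)

x = plus 1.0 1
y = minus 2 1
```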
The types inferred would be:
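```haskell
plus  :: Fractional a => a -> a -> a   -- a is monomorphic (rule 1)
minus :: Num b => b -> b -> b          -- b is monomorphic (rule 1)
x     :: Fractional a => a
y     :: Num b => b
```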
which, by defaulting rules, become:
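```haskell
plus  :: Double -> Double -> Double
minus :: Integer -> Integer -> Integer
x     :: Double
y     :: Integer
```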
Note that this explains why in the example in the question only the `sort` definition raises an error: the type `Ord a => [a] -> [a]` cannot be defaulted because `Ord` isn't a numeric class.
Extended defaulting
Note that GHCi comes with extended defaulting rules (documented in the GHC users guide), which can be enabled in files as well using the `ExtendedDefaultRules` extension.
The defaultable type variables need not appear only in constraints where all the classes are standard, and there must be at least one class that is among `Eq`, `Ord`, `Show` or `Num` and its subclasses.
Moreover the default `default` declaration is `default ((), Integer, Double)`.
This may produce odd results. Taking the example from the question, a GHCi session along these lines:
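```haskell
ghci> import Data.List (sortBy)
ghci> let sort = sortBy compare
ghci> :t sort
sort :: [()] -> [()]
```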
in ghci we don't get a type error, but the `Ord a` constraint results in a default of `()`, which is pretty much useless.
Useful links
There are a lot of resources and discussions about the monomorphism restriction.
Here are some links that I find useful and that may help you understand the topic or dig deeper into it.