Numeric literals have a polymorphic type:
*Main> :t 3
3 :: (Num t) => t
But if I bind a variable to such a literal, the polymorphism is lost:
x = 3
...
*Main> :t x
x :: Integer
If I define a function, on the other hand, it is of course polymorphic:
f x = 3
...
*Main> :t f
f :: (Num t1) => t -> t1
I could provide a type signature to ensure that x remains polymorphic:
x :: Num a => a
x = 3
...
*Main> :t x
x :: (Num a) => a
But why is this necessary? Why isn't the polymorphic type inferred?
To expand on sepp2k's answer a bit: if you try to compile the following (or load it into GHCi), you get an error:
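(Reconstructed from the f = sort example discussed below:)

import Data.List (sort)

-- A binding with a class constraint (Ord a, via sort) but no
-- explicit arguments, so the monomorphism restriction kicks in.
f = sort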
This is a violation of the monomorphism restriction because we have a class constraint (introduced by sort) but no explicit arguments: we're (somewhat mysteriously) told that we have an "Ambiguous type variable" in the constraint "Ord a".
Your example (let x = 3) has a similarly ambiguous type variable, but it doesn't give the same error, because it's saved by Haskell's "defaulting" rules. See this answer for more information about the defaulting rules; the important point is that they only work for certain numeric classes, so x = 3 is fine while f = sort isn't.
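For reference, when a module contains no default declaration, the Report treats it as if it contained the one below, which is why the literal 3 in the question lands on Integer:

default (Integer, Double)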
As a side note: if you'd prefer that x = 3 end up being an Int instead of an Integer, and y = 3.0 a Rational instead of a Double, you can use a "default declaration" to override the default defaulting rules:
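-- at the top level of a module:
default (Int, Rational)

When an ambiguous numeric constraint needs resolving, the types in the declaration are tried left to right and the first one satisfying all the constraints wins, so 3 (Num) becomes an Int, while 3.0 (Fractional) skips Int and becomes a Rational.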
It's the monomorphism restriction, which says that all values that are defined without parameters and don't have an explicit type annotation must have a monomorphic type. This restriction can be disabled in GHC and GHCi with -XNoMonomorphismRestriction.
The reason for the restriction is that, without it, long_calculation 42 would be evaluated twice, while most people would probably expect/want it to be evaluated only once:
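Roughly (a sketch only; long_calculation here is a hypothetical stand-in for any expensive overloaded function):

-- Hypothetical stand-in for an expensive overloaded computation.
long_calculation :: Num a => a -> a
long_calculation n = n + 1

-- Without the restriction this binding would be generalized to
--   x :: Num a => a
-- which is compiled as a function taking a Num dictionary, so every
-- use of x re-runs long_calculation:
--
--   x = long_calculation 42
--   pair = (x, x)  -- long_calculation runs twice

-- With the restriction (plus numeric defaulting), x is a plain
-- monomorphic value, computed at most once and then shared:
x :: Integer
x = long_calculation 42

main :: IO ()
main = print (x, x)  -- long_calculation runs only once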