The code

```haskell
{-# LANGUAGE ScopedTypeVariables, TypeApplications #-}

-- I know this particular example is silly.
-- But that's not the point here.
g :: forall a. RealFloat a => Bool
g = True

main :: IO ()
main = print (g @Double)
```

fails to compile on GHC 8.0 with the error
```
• Could not deduce (RealFloat a0)
  from the context: RealFloat a
    bound by the type signature for:
               g :: RealFloat a => Bool
    at app/Main.hs:3:6-35
  The type variable ‘a0’ is ambiguous
• In the ambiguity check for ‘g’
  To defer the ambiguity check to use sites, enable AllowAmbiguousTypes
  In the type signature:
    g :: forall a. RealFloat a => Bool
```
So adding `AllowAmbiguousTypes` will make the code compile.
Here are my questions:

- What exactly is `AllowAmbiguousTypes`?
- Why is it needed to make this particular code work?
- I fear that adding `AllowAmbiguousTypes` is giving me more than I really want in this particular code. It sounds scary. It sounds like it will make Haskell's type system less safe, perhaps in other areas that have nothing to do with this particular code. Are these fears unfounded?
- Are there any alternatives? In this case, it seems like Haskell is inserting an `a0` type variable that I never asked for. Is there no extension to tell Haskell not to create these extraneous type variables, and only use those that I explicitly told it to add with my own explicit `forall a`?
- Added one question because of user2407038's comment: Would you say that `AllowAmbiguousTypes` is a misnomer? Would it have been better named `AllowUnusedTypeVariables`?
**What exactly is `AllowAmbiguousTypes`?**

From the latest GHC docs, "a type `ty` is ambiguous if and only if `((undefined :: ty) :: ty)` would fail to typecheck". The extension `AllowAmbiguousTypes` just disables this check; disabling it still won't allow ill-typed programs through.
**Why is it needed to make this particular code work?**

In order to check that `RealFloat a` is satisfied whenever `g` is used, GHC needs to know what `a` is. You have no way (in vanilla Haskell¹) of telling GHC what `a` should be, since `a` occurs nowhere else in the type of `g`. No amount of annotations will let you use `g` without getting an ambiguous type variable error.

However, if you don't use `g` anywhere, you can still get your code to compile by turning on `AllowAmbiguousTypes`.
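With `TypeApplications` (as in the question's `g @Double`), a use site can pin `a` down explicitly even though `a` occurs nowhere outside the constraint. A minimal sketch of the pattern; the name `digitsOf` is mine, not from the question:

```haskell
{-# LANGUAGE AllowAmbiguousTypes, ScopedTypeVariables, TypeApplications #-}

-- 'a' appears only in the constraint, so the signature fails the
-- ambiguity check; AllowAmbiguousTypes lets the definition through,
-- and a type application picks 'a' at each use site.
digitsOf :: forall a. RealFloat a => Int
digitsOf = floatDigits (undefined :: a)

main :: IO ()
main = do
  print (digitsOf @Double)  -- 53 mantissa bits
  print (digitsOf @Float)   -- 24 mantissa bits
```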
**I fear that adding `AllowAmbiguousTypes` is giving me more than I really want in this particular code. It sounds scary. It sounds like it will make Haskell's type system less safe, perhaps in other areas that have nothing to do with this particular code. Are these fears unfounded?**

Yes, they are: the ambiguity check lets you catch definitions which cannot (in vanilla Haskell, which doesn't have `TypeApplications`¹) be used without resulting in an ambiguous type variable error. Disabling this check just means you will see the ambiguous type variable messages when you use the expression (or function) instead of at its definition site.
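Concretely, with the question's `g` and the extension enabled, the definition compiles fine; it is only an unapplied use that errors. A sketch mirroring the question's code:

```haskell
{-# LANGUAGE AllowAmbiguousTypes, ScopedTypeVariables, TypeApplications #-}

g :: forall a. RealFloat a => Bool
g = True

main :: IO ()
main = do
  print (g @Double)  -- fine: the type application pins 'a' at the use site
  -- print g         -- uncommenting this moves the "ambiguous type
                     -- variable" error to this line, the use site
```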
**Are there any alternatives? In this case, it seems like Haskell is inserting an `a0` type variable that I never asked for. Is there no extension to tell Haskell not to create these extraneous type variables, and only use those that I explicitly told it to add with my own explicit `forall a`?**

The `a0` is coming from the ambiguity check I mentioned at the beginning of this answer. GHC just uses the name `a0` to make it clear it is different from `a`. The ambiguity check basically just tries to typecheck

```haskell
((undefined :: forall a. RealFloat a => Bool) :: forall a0. RealFloat a0 => Bool)
```

`AllowAmbiguousTypes` removes this check. I don't think there is an extension that disables ambiguity checks only on type signatures with explicit `forall`s (although this might be neat and useful!).
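One long-standing alternative that needs neither `AllowAmbiguousTypes` nor `TypeApplications` is to mention `a` in an argument type, e.g. via `Data.Proxy`, so that an ordinary annotation at the call site fixes the type. A sketch (the `Proxy` variant is my illustration, not from the question):

```haskell
{-# LANGUAGE ScopedTypeVariables #-}

import Data.Proxy (Proxy (..))

-- Because 'a' now occurs in an argument type, the signature passes the
-- ambiguity check, and callers can pin 'a' with a plain annotation.
g :: forall a. RealFloat a => Proxy a -> Bool
g _ = True

main :: IO ()
main = print (g (Proxy :: Proxy Double))
```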
**Would you say that `AllowAmbiguousTypes` is a misnomer? Would it have been better named `AllowUnusedTypeVariables`?**

Naming things is hard. :)

The current name references the type of errors you get without the extension enabled, so it isn't a bad name. I guess this is a matter of opinion. (A lot of people also wish `Monad` were called something like `FlatMapAble`.)
¹ Prior to `TypeApplications` (a relatively new extension, introduced in GHC 8.0), there really was no way of using expressions that triggered the ambiguity check without getting an ambiguous type variable error, so `AllowAmbiguousTypes` was a lot less useful.