I know it's not Pythonic to write functions that care about the type of the arguments, but there are cases when it's simply impossible to ignore types because they are handled differently.
Having a bunch of isinstance checks in your function is just ugly; is there any function decorator available that enables function overloads? Something like this:
@overload(str)
def func(val):
    print('This is a string')

@overload(int)
def func(val):
    print('This is an int')
Update:
Here are some comments I left on David Zaslavsky's answer:
With a few modifications, this will suit my purposes pretty well. One other limitation I noticed in your implementation: since you use func.__name__ as the dictionary key, you are prone to name collisions between modules, which is not always desirable.

For example, if I have one module that overloads func, and another completely unrelated module that also overloads func, these overloads will collide because the function dispatch dict is global. That dict should be made local to the module, somehow. And not only that, it should also support some kind of 'inheritance'.

By 'inheritance' I mean this: say I have a module first with some overloads. Then two more modules that are unrelated but each import first; both of these modules add new overloads to the already existing ones that they just imported. These two modules should be able to use the overloads in first, but the new ones that they just added should not collide with each other between modules. (This is actually pretty hard to do right, now that I think about it.)
Some of these problems could possibly be solved by changing the decorator syntax a little bit:
first.py

@overload(str, str)
def concatenate(a, b):
    return a + b

@concatenate.overload(int, int)
def concatenate(a, b):
    return str(a) + str(b)
second.py

from first import concatenate

@concatenate.overload(float, str)
def concatenate(a, b):
    return str(a) + b
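A minimal sketch of how this proposed syntax might be implemented, dispatching on positional argument types only (the overload factory and the .overload attribute are hypothetical names chosen to mirror the examples above):

def overload(*arg_types):
    # First use: turn the decorated function into a dispatcher that
    # resolves an implementation by the types of its positional arguments.
    def wrap(func):
        dispatch_table = {arg_types: func}

        def dispatcher(*args):
            key = tuple(type(a) for a in args)
            if key not in dispatch_table:
                raise TypeError('no overload of %s for %r' % (func.__name__, key))
            return dispatch_table[key](*args)

        def add_overload(*more_types):
            # Enables the @dispatcher.overload(...) form used in second.py.
            def register(impl):
                dispatch_table[more_types] = impl
                return dispatcher  # keep the name bound to the same dispatcher
            return register

        dispatcher.overload = add_overload
        return dispatcher
    return wrap

Note that this sketch still keeps a single table per dispatcher, shared by everything that imports it, so the per-module 'inheritance' described above isn't solved: overloads added in second.py would be visible to any other module importing first.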
Quick answer: there is an overload package on PyPI which implements this more robustly than what I describe below, albeit with a slightly different syntax. It's declared to work only with Python 3, but it looks like only slight modifications (if any; I haven't tried) would be needed to make it work with Python 2.
Long answer: In languages where you can overload functions, the name of a function is (either literally or effectively) augmented by information about its type signature, both when the function is defined and when it is called. When a compiler or interpreter looks up the function definition, then, it uses both the declared name and the types of the parameters to resolve which function to access. So the logical way to implement overloading in Python is to implement a wrapper that uses both the declared name and the parameter types to resolve the function.
Here's a simple implementation:
from collections import defaultdict

def determine_types(args, kwargs):
    # Build a hashable key from the types of the positional and keyword arguments.
    return tuple([type(a) for a in args]), \
           tuple([(k, type(v)) for k, v in kwargs.iteritems()])

# Maps each function name to a dict of type signatures -> implementations.
function_table = defaultdict(dict)

def overload(arg_types=(), kwarg_types=()):
    def wrap(func):
        named_func = function_table[func.__name__]
        named_func[arg_types, kwarg_types] = func
        def call_function_by_signature(*args, **kwargs):
            return named_func[determine_types(args, kwargs)](*args, **kwargs)
        return call_function_by_signature
    return wrap
overload should be called with two optional arguments: a tuple representing the types of all positional arguments, and a tuple of tuples representing the name-type mappings of all keyword arguments. Here's a usage example:
>>> @overload((str, int))
... def f(a, b):
...     return a * b
>>> @overload((int, int))
... def f(a, b):
...     return a + b
>>> print f('a', 2)
aa
>>> print f(4, 2)
6
>>> @overload((str,), (('foo', int), ('bar', float)))
... def g(a, foo, bar):
...     return foo*a + str(bar)
>>> @overload((str,), (('foo', float), ('bar', float)))
... def g(a, foo, bar):
...     return a + str(foo*bar)
>>> print g('a', foo=7, bar=4.4)
aaaaaaa4.4
>>> print g('b', foo=7., bar=4.4)
b30.8
Shortcomings of this include:

- It doesn't actually check that the function the decorator is applied to is even compatible with the arguments given to the decorator. You could write

  @overload((str, int))
  def h():
      return 0

  and you'd get an error when the function was called.

- It doesn't gracefully handle the case where no overloaded version exists corresponding to the types of the arguments passed (it would help to raise a more descriptive error).

- It distinguishes between named and positional arguments, so something like g('a', 7, bar=4.4) doesn't work.

- There are a lot of nested parentheses involved in using this, as in the definitions for g.

- As mentioned in the comments, this doesn't deal with functions having the same name in different modules.
All of these could be remedied with enough fiddling, I think. In particular, the issue of name collisions is easily resolved by storing the dispatch table as an attribute of the function returned from the decorator. But as I said, this is just a simple example to demonstrate the basics of how to do it.
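For example, a minimal sketch of that fix, reusing determine_types from above and handling only module-level functions: the dispatch table lives on the dispatcher returned by the decorator, and a call with no matching overload raises a more descriptive TypeError instead of a bare KeyError.

def overload(arg_types=(), kwarg_types=()):
    def wrap(func):
        # If the name being decorated is already bound to one of our
        # dispatchers in the defining module, add to its table; otherwise
        # create a fresh dispatcher with its own private table.
        existing = func.__globals__.get(func.__name__)
        if hasattr(existing, 'dispatch_table'):
            dispatcher = existing
        else:
            def dispatcher(*args, **kwargs):
                key = determine_types(args, kwargs)
                try:
                    impl = dispatcher.dispatch_table[key]
                except KeyError:
                    raise TypeError('no overload of %s matches %s'
                                    % (func.__name__, key))
                return impl(*args, **kwargs)
            dispatcher.dispatch_table = {}
        dispatcher.dispatch_table[arg_types, kwarg_types] = func
        return dispatcher
    return wrap

Since each module ends up with its own dispatcher object, two unrelated modules that both overload a function named func no longer share a table.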
This doesn't directly answer your question, but if you really want to have something that behaves like an overloaded function for different types and (quite rightly) don't want to use isinstance then I'd suggest something like:
def func(int_val=None, str_val=None):
    if sum(x is not None for x in (int_val, str_val)) != 1:
        raise TypeError('exactly one of int_val and str_val should be passed in')
    if int_val is not None:
        print('This is an int')
    if str_val is not None:
        print('This is a string')
In use the intent is obvious, and it doesn't even require the different options to have different types:
func(int_val=3)
func(str_val="squirrel")