Say I have classes A, B and C. Classes A and B are both mixin classes for class C.
class A( object ):
    pass

class B( object ):
    pass

class C( object, A, B ):
    pass
This will not work when instantiating class C. I would have to remove object from class C to make it work (otherwise you get MRO problems):
TypeError: Error when calling the metaclass bases
Cannot create a consistent method resolution
order (MRO) for bases B, object, A
However, my case is a bit more complicated. In my case class C is a server where A and B will be plugins that are loaded on startup; they reside in their own folder. I also have a class named Cfactory. In Cfactory I have a __new__ method that will create a fully functional C object. In the __new__ method I search for plugins, load them using __import__, and then assign them to C with C.__bases__ += (loadedClassTypeGoesHere, )
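A rough sketch of that loading step might look like the following; the plugins package name and the convention that each module exposes a Plugin class are just illustrative assumptions:

import os

def load_plugins_into_C(plugin_package="plugins"):
    # scan the plugin folder for Python modules (hypothetical layout)
    for filename in os.listdir(plugin_package):
        if not filename.endswith(".py") or filename.startswith("_"):
            continue
        modname = filename[:-3]
        # __import__ returns the top-level package, so use fromlist to get the submodule
        module = __import__("%s.%s" % (plugin_package, modname), fromlist=[modname])
        plugin_class = getattr(module, "Plugin")  # assumed naming convention
        C.__bases__ += (plugin_class,)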
So the following is a possibility (I've kept it quite abstract):
class A( object ):
    def __init__( self ): pass
    def printA( self ): print "A"

class B( object ):
    def __init__( self ): pass
    def printB( self ): print "B"

class C( object ):
    def __init__( self ): pass

class Cfactory( object ):
    def __new__( cls ):
        C.__bases__ += ( A, )
        C.__bases__ += ( B, )
        return C()
This again will not work, and gives the same MRO error:
TypeError: Cannot create a consistent method resolution
order (MRO) for bases object, A
An easy fix for this is removing the object base class from A and B. However, this will make them old-style classes, which should be avoided when these plugins are run stand-alone (which should be possible, for unit-testing purposes).
Another easy fix is removing object from C, but this will also make it an old-style class and C.__bases__ will be unavailable, so I can't add extra classes to the bases of C.

What would be a good architectural solution for this, and how would you do something like this? For now I can live with old-style classes for the plugins themselves, but I'd rather not use them.
Think of it this way -- you want the mixins to override some of the behaviors of object, so they need to be before object in the method resolution order.
So you need to change the order of the bases:
class C(A, B, object):
    pass
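With that ordering, instantiation works and the MRO is what you would expect:

c = C()            # no TypeError any more
print C.__mro__    # (C, A, B, object) -- the mixins come before object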
Due to this bug, you need C not to inherit from object directly to be able to correctly assign to __bases__, and the factory really could just be a function:
class FakeBase(object):
    pass

class C(FakeBase):
    pass

def c_factory():
    for base in (A, B):
        if base not in C.__bases__:
            C.__bases__ = (base,) + C.__bases__
    return C()
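For example, calling the factory more than once only patches the bases the first time:

obj = c_factory()
print C.__bases__    # (B, A, FakeBase) after the first call
obj = c_factory()    # the membership check keeps the bases from growing
print C.__bases__    # still (B, A, FakeBase)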
I don't know the details, so maybe I'm completely off-base here, but it seems like you're using the wrong mechanisms to achieve your design.
First off, why is Cfactory a class, and why does its __new__ method return an instance of something else? That looks like a bizarre way to implement what is quite naturally a function. Cfactory as you've described it (and shown in a simplified example) doesn't behave at all like a class; you don't have multiple instances of it that share functionality (in fact it looks like you've made it impossible to construct instances of it naturally).
To be honest, C doesn't look very much like a class to me either. It seems like you can't be creating more than one instance of it, otherwise you'd end up with an ever-growing bases list. So that makes C basically a module rather than a class, only with extra boilerplate. I try to avoid the "single-instance class to represent the application or some external system" pattern (though I know it's popular because Java requires that you use it). But the class inheritance mechanism can often be handy for things that aren't really classes, such as your plugin system.
I would have done this with a classmethod on C to find and load plugins, invoked by the module defining C so that it's always in a good state. Alternatively, you could use a metaclass to automatically add whatever plugins it finds to the class's bases. Mixing the mechanism for configuring the class in with the mechanism for creating an instance of the class seems wrong; it's the opposite of a flexible, decoupled design.
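A minimal sketch of the classmethod variant, assuming the A and B plugin classes from the question, a placeholder find_plugins() helper, and an extra base class to sidestep the __bases__ assignment bug mentioned in the other answer:

def find_plugins():
    # placeholder for real discovery logic (e.g. scanning the plugin folder)
    return (A, B)

class _CBase(object):
    # indirection so that assigning to C.__bases__ is allowed
    pass

class C(_CBase):
    @classmethod
    def load_plugins(cls):
        for plugin in find_plugins():
            if plugin not in cls.__bases__:
                cls.__bases__ = (plugin,) + cls.__bases__

# invoked by the module that defines C, so the class is always in a good state
C.load_plugins()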
If the plugins can't be loaded at the time C is created, then I would go with manually invoking the configurator classmethod at the point when you can search for plugins, before the C instance is created.
Actually, if the class can't be put into a consistent state as soon as it's created I would probably rather go for dynamic class creation than modifying the bases of an existing class. Then the system isn't locked into the class being configured once and instantiated once; you're at least open to the possibility of having multiple instances with different sets of plugins loaded. Something like this:
def Cfactory(*args, **kwargs):
    plugins = find_plugins()
    bases = (C,) + plugins
    cls = type('C_with_plugins', bases, {})
    return cls(*args, **kwargs)
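For example, assuming C inherits directly from object and find_plugins() returns the plugin classes, each call builds a fresh derived class and leaves C itself untouched:

c1 = Cfactory()
print type(c1).__name__     # C_with_plugins
print type(c1).__bases__    # (C, <plugin classes>)
print C.__bases__           # still just (object,)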
That way, you have your single call to create your C instance, which gives you a correctly configured instance, but it doesn't have strange side effects on any other hypothetical instances of C that might already exist, and its behaviour doesn't depend on whether it's been run before. I know you probably don't need either of those two properties, but it's barely more code than you have in your simplified example, and why break the conceptual model of what classes are if you don't have to?
There is a simple workaround: create a helper class with a nice name, like PluginBase, and use that to inherit from instead of object.
This makes the code more readable (IMHO) and it circumvents the bug.
class PluginBase(object): pass
class ServerBase(object): pass
class pluginA(PluginBase): "Now it is clearly a plugin class"
class pluginB(PluginBase): "Another plugin"
class Server1(ServerBase, pluginA, pluginB): "This works"
class Server2(ServerBase): pass
Server2.__bases__ += (pluginA,) # This also works
As a note: you probably don't need the factory; it's needed in C++, but hardly ever in Python.