django-rest-framework, multitable model inheritance

Posted 2019-02-04 16:33

Question:

I can't find this info in the docs or on the interwebs.
Latest django-rest-framework, Django 1.6.5.

How does one create a ModelSerializer that can handle nested serializers where the nested model is implemented using multitable inheritance?

e.g.

######## MODELS
from django.db import models

class OtherModel(models.Model):
    stuff = models.CharField(max_length=255)

class MyBaseModel(models.Model):
    whaddup = models.CharField(max_length=255)
    other_model = models.ForeignKey(OtherModel)

class ModelA(MyBaseModel):
    attr_a = models.CharField(max_length=255)

class ModelB(MyBaseModel):
    attr_b = models.CharField(max_length=255)


####### SERIALIZERS
from rest_framework import serializers

class MyBaseModelSerializer(serializers.ModelSerializer):
    class Meta:
        model=MyBaseModel

class OtherModelSerializer(serializers.ModelSerializer):
    mybasemodel_set = MyBaseModelSerializer(many=True)

    class Meta:
        model = OtherModel

This obviously doesn't work, but it illustrates what I'm trying to do here.
In OtherModelSerializer, I'd like mybasemodel_set to serialize specific representations of either ModelA or ModelB, depending on what we have.

If it matters, I'm also using django-model-utils and its InheritanceManager, so I can retrieve a queryset where each instance is already an instance of the appropriate subclass.
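
For concreteness, the manager setup I mean is roughly this (just a sketch; the objects = InheritanceManager() line is the relevant part):

from model_utils.managers import InheritanceManager

class MyBaseModel(models.Model):
    whaddup = models.CharField(max_length=255)
    other_model = models.ForeignKey(OtherModel)

    # select_subclasses() then returns ModelA / ModelB instances
    # instead of plain MyBaseModel rows
    objects = InheritanceManager()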

Thanks

Answer 1:

I've solved this issue a slightly different way.

Using:

  • DRF 3.5.x
  • django-model-utils 2.5.x

My models.py looks like this:

from django.db import models

# CATEGORY (a choices tuple) and Family are defined elsewhere in the project

class Person(models.Model):
    first_name = models.CharField(max_length=40, blank=False, null=False)
    middle_name = models.CharField(max_length=80, blank=True, null=True)
    last_name = models.CharField(max_length=80, blank=False, null=False)
    family = models.ForeignKey(Family, blank=True, null=True)


class Clergy(Person):
    category = models.IntegerField(choices=CATEGORY, blank=True, null=True)
    external = models.NullBooleanField(default=False, null=True)
    # string reference because ClergyStatus is declared further down
    clergy_status = models.ForeignKey("ClergyStatus", related_name="%(class)s_status", blank=True, null=True)


class Religious(Person):
    # string reference because ReligiousOrder is declared further down
    religious_order = models.ForeignKey("ReligiousOrder", blank=True, null=True)
    major_superior = models.ForeignKey(Person, blank=True, null=True, related_name="%(class)s_superior")


class ReligiousOrder(models.Model):
    name = models.CharField(max_length=255, blank=False, null=False)
    initials = models.CharField(max_length=20, blank=False, null=False)


class ClergyStatus(models.Model):
    display_name = models.CharField(max_length=255, blank=True, null=True)
    description = models.CharField(max_length=255, blank=True, null=True)

Basically, the base model is the "Person" model, and a person can either be Clergy, Religious, or neither and simply be a "Person". The models that inherit from Person also have their own special relationships.

In my views.py I utilize a mixin to "inject" the subclasses into the queryset like so:

class PersonSubClassFieldsMixin(object):

    def get_queryset(self):
        # select_subclasses() comes from django-model-utils' InheritanceManager,
        # so each Person comes back as its concrete subclass (Clergy, Religious, ...)
        return Person.objects.select_subclasses()

class RetrievePersonAPIView(PersonSubClassFieldsMixin, generics.RetrieveDestroyAPIView):
    serializer_class = PersonListSerializer
    ...

And then the really "unDRY" part comes in serializers.py, where I declare the "base" PersonListSerializer but override its to_representation method to return specialized serializers based on the instance type, like so:

from rest_framework import serializers


class PersonListSerializer(serializers.ModelSerializer):

    def to_representation(self, instance):
        if isinstance(instance, Clergy):
            return ClergySerializer(instance=instance).data
        elif isinstance(instance, Religious):
            return ReligiousSerializer(instance=instance).data
        else:
            return LaySerializer(instance=instance).data

    class Meta:
        model = Person
        fields = '__all__'


class ReligiousSerializer(serializers.ModelSerializer):
    class Meta:
        model = Religious
        fields = '__all__'
        depth = 2


class LaySerializer(serializers.ModelSerializer):
    class Meta:
        model = Person
        fields = '__all__'


class ClergySerializer(serializers.ModelSerializer):
    class Meta:
        model = Clergy
        fields = '__all__'
        depth = 2

The "switch" happens in the to_representation method of the main serializer (PersonListSerializer). It looks at the instance type, and then "injects" the needed serializer. Since Clergy, Religious are all inherited from Person getting back a Person that is also a Clergy member, returns all the Person fields and all the Clergy fields. Same goes for Religious. And if the Person is neither Clergy or Religious - the base model fields are only returned.

Not sure if this is the proper approach, but it seems very flexible and fits my use case. Note that I save/update/create Person through different views/serializers, so I don't have to worry about writes with this type of setup.



Answer 2:

I was able to do this by creating a custom related field:

class MyBaseModelField(serializers.RelatedField):
    def to_native(self, value):
        if isinstance(value, ModelA):
            a_s = ModelASerializer(instance=value)
            return a_s.data
        if isinstance(value, ModelB):
            b_s = ModelBSerializer(instance=value)
            return b_s.data

        raise NotImplementedError


class OtherModelSerializer(serializers.ModelSerializer):
    mybasemodel_set = MyBaseModelField(many=True)

    class Meta:
        model = OtherModel
        # make sure we manually include the reverse relation
        fields = ('id', 'stuff', 'mybasemodel_set')

I do have concerns that instantiating a Serializer for each object in the reverse relation queryset is expensive, so I'm wondering if there is a better way to do this.

Another approach I tried was dynamically changing the model field on MyBaseModelSerializer inside __init__, but I ran into the issue described here:
django rest framework nested modelserializer
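
For anyone on a newer stack: in DRF 3.x the hook on a custom field is to_representation rather than to_native, so the same idea would look roughly like this (a sketch under that assumption, with ModelASerializer/ModelBSerializer as above):

class MyBaseModelField(serializers.RelatedField):
    def to_representation(self, value):
        # DRF 3.x equivalent of to_native above
        if isinstance(value, ModelA):
            return ModelASerializer(instance=value).data
        if isinstance(value, ModelB):
            return ModelBSerializer(instance=value).data
        raise NotImplementedError


class OtherModelSerializer(serializers.ModelSerializer):
    # DRF 3.x requires read_only=True (or a queryset) on relational fields
    mybasemodel_set = MyBaseModelField(many=True, read_only=True)

    class Meta:
        model = OtherModel
        fields = ('id', 'stuff', 'mybasemodel_set')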



Answer 3:

I'm attempting to use a solution that involves different serializer subclasses for the different model subclasses:

class MyBaseModelSerializer(serializers.ModelSerializer):

    @staticmethod
    def _get_alt_class(cls, args, kwargs):
        if cls is not MyBaseModelSerializer:
            # we're instantiating a subclass already, use that class
            return cls

        # < logic to choose an alternative class to use >
        # in my case, I'm inspecting kwargs["data"] to make a decision
        # alt_cls = SomeSubClass

        return alt_cls

    def __new__(cls, *args, **kwargs):
        alt_cls = MyBaseModelSerializer._get_alt_class(cls, args, kwargs)
        return super(MyBaseModelSerializer, alt_cls).__new__(alt_cls, *args, **kwargs)

    class Meta:
        model=MyBaseModel

class ModelASerializer(MyBaseModelSerializer):
    class Meta:
        model=ModelA

class ModelBSerializer(MyBaseModelSerializer):
    class Meta:
        model=ModelB

That is, when you try to instantiate an object of type MyBaseModelSerializer, you actually end up with an object of one of the subclasses, which serializes (and, crucially for me, deserializes) correctly.

I've just started using this, so it's possible that there are problems I've not run into yet.
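
Purely for illustration, the elided "< logic to choose an alternative class to use >" step could, for the question's models, be something like this hypothetical dispatch (not my actual code; the field checks are assumptions):

    @staticmethod
    def _get_alt_class(cls, args, kwargs):
        if cls is not MyBaseModelSerializer:
            return cls
        # hypothetical: pick the serializer by which subclass-specific
        # field shows up in the incoming data
        data = kwargs.get("data") or {}
        if "attr_a" in data:
            return ModelASerializer
        if "attr_b" in data:
            return ModelBSerializer
        return MyBaseModelSerializer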