I am currently using the Django framework, including its models mechanism, to abstract the database schema declaration and general DB access, and it works fine for most scenarios.
However, my application also requires tables to be created and accessed dynamically at runtime, which, as far as I can see, is not supported by Django out of the box.
These tables usually have an identical structure and could basically be abstracted by the same model class, but Django doesn't let you change the underlying `db_table` per query, as it is declared on the model class and not on the manager.
My solution is to run the following process whenever I need a new table to be created, populated and accessed:
- Create and populate the table using raw sql
- Add indexes to the table using raw SQL (both steps are sketched below)
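For reference, here is a minimal sketch of those two steps, assuming PostgreSQL and illustrative column names that mirror the shared model structure (`create_dynamic_table` is my own helper name, not part of the actual code):

```python
from django.db import connection

def create_dynamic_table(table_name):
    """Create and index one dynamic table via raw SQL (illustrative sketch)."""
    # Identifiers cannot be passed as query parameters, so table_name is
    # interpolated directly -- it must come from a trusted source.
    with connection.cursor() as cursor:
        cursor.execute(
            'CREATE TABLE "%s" ('
            '  id bigserial PRIMARY KEY,'
            '  field1 varchar(200),'
            '  field2 varchar(200),'
            '  field3 varchar(200)'
            ')' % table_name
        )
        cursor.execute(
            'CREATE INDEX "%s_field1_idx" ON "%s" (field1)'
            % (table_name, table_name)
        )
```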
When I need to access the table (using the Django queryset API), I declare a new model class dynamically and return it as the model for the query, using this code:
```python
import logging

from django.apps import apps
from django.db import connection, models

logger = logging.getLogger(__name__)

# (this code runs inside a helper that returns the model class)
table_name = ...  # name of the table created by raw SQL
model_name = '%d_%s' % (connection.tenant.id, table_name)
try:
    model = apps.get_registered_model('myapp', model_name)
    return model
except LookupError:
    pass
logger.debug("no model exists for model %s, creating one" % model_name)

class Meta:
    db_table = table_name
    managed = False  # tell Django not to manage this table's schema

attrs = {
    'field1': models.CharField(max_length=200),
    'field2': models.CharField(max_length=200),
    'field3': models.CharField(max_length=200),
    '__module__': 'myapp.models',
    'Meta': Meta,
}
model = type(str(model_name), (models.Model,), attrs)
return model
```
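For what it's worth, assuming the snippet above is wrapped in a helper (`get_table_model` below is my own, hypothetical name), the returned class can then be used through the normal queryset API:

```python
model = get_table_model(table_name)  # hypothetical wrapper around the snippet above
recent = model.objects.filter(field1='some value').order_by('-id')[:10]
```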
Note that I do check whether the model is already registered in Django and reuse the existing model if it is. The model name is always unique per table. Since I'm using multiple tenants, the tenant id is also part of the model name, to avoid conflicts between similar tables declared in different schemas.
- In case it's not clear: the dynamically created tables will, and should, persist permanently across sessions.
This solution works fine for me so far.
However, the application will need to support a large number of these tables, i.e. 10,000-100,000 such tables (and corresponding model classes), with up to a million rows per table.
Assuming the underlying database can handle this load, my questions are:
Do you see any problems with this solution, both with and without regard to the expected scale?
Does anybody have a better solution for this scenario?
Thanks.