I have an application that imports 880 rows into the App Engine datastore via NDB, using put_async(). Every time I run this import, it blows through the daily free quota of 50,000 datastore write ops.
I'm trying to understand why this operation is so expensive and what can be done to stay under quota.
The model has 13 properties:
from google.appengine.ext import ndb

stringbool = ['true', 'false']

class BeerMenu(ndb.Model):
    name = ndb.StringProperty()
    brewery = ndb.StringProperty()
    origin = ndb.StringProperty()
    abv = ndb.FloatProperty()
    size = ndb.FloatProperty()
    meas = ndb.StringProperty()
    price = ndb.FloatProperty()
    active = ndb.StringProperty(default="false", choices=stringbool)
    url = ndb.StringProperty()
    bartender = ndb.StringProperty()
    lineno = ndb.IntegerProperty()
    purdate = ndb.DateProperty()
    costper = ndb.FloatProperty()
I've trimmed index.yaml back to a single composite index:
indexes:
- kind: BeerMenu
  properties:
  - name: brewery
  - name: name
According to the SDK datastore viewer, each row costs 29 write ops, so the 880 rows alone would generate 25,520 writes. I'm assuming that the indexes consume the rest of the write ops, but I don't know exactly how many, because App Engine only tells me that I've exceeded the quota.
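If I understand the billing model correctly (this is my assumption, not something the quota page shows me), each entity put costs 1 write for the entity itself, plus 2 writes (ascending and descending) per indexed property value, plus 2 writes per composite index entry. Under that assumption, the numbers line up with what the SDK viewer reports:

```python
# Assumed write-op cost formula for a single entity put:
#   1 (entity) + 2 per indexed property + 2 per composite index entry
INDEXED_PROPERTIES = 13  # all 13 BeerMenu properties are indexed by default
COMPOSITE_INDEXES = 1    # the single brewery/name index in index.yaml
ROWS = 880

writes_per_entity = 1 + 2 * INDEXED_PROPERTIES + 2 * COMPOSITE_INDEXES
total_writes = ROWS * writes_per_entity

print(writes_per_entity)  # 29, matching the SDK datastore viewer
print(total_writes)       # 25520
```

If that formula is right, the per-property indexes (not the composite index) account for 26 of the 29 writes per row, which would explain most of the cost.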
What are the best strategies for reducing the number of write ops?