I am trying to move my data from Cloud SQL to Cloud Datastore.
There are a bit under 5 million entries in the SQL database.
It seems like I can only move over 100,000 entities per day before I get a quota error.
I can't figure out exactly which quota I'm exceeding, but I do have exponential backoff in place to make sure I'm not sending requests too fast.
Eventually the backoff delay reaches 5 minutes and the connection to the SQL server dies, so I don't think the writes-per-second quota is the problem. I also don't see any other quota being exceeded on my APIs page or the App Engine API page.
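The backoff logic looks roughly like this (a minimal sketch, not the exact code from the gists below; `commit_fn` stands in for whatever function performs the Datastore commit, and the 5-minute cap matches what I described above):

```python
import random
import time

def commit_with_backoff(commit_fn, max_delay=300, base_delay=1):
    """Retry commit_fn with exponential backoff, capped at max_delay seconds."""
    delay = base_delay
    while True:
        try:
            return commit_fn()
        except Exception:  # in practice, catch the RESOURCE_EXHAUSTED RPC error specifically
            if delay >= max_delay:
                raise  # give up once the 5-minute cap is reached
            time.sleep(random.uniform(0, delay))  # full jitter to spread out retries
            delay = min(delay * 2, max_delay)
```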
I have tried two different APIs to write the records.
The GCP Datastore API
import googledatastore
Here is the code:
https://gist.github.com/nburn42/d8b488da1d2dc53df63f4c4a32b95def
And the Dataflow API
from apache_beam.io.gcp.datastore.v1.datastoreio import WriteToDatastore
Here is the code:
https://gist.github.com/nburn42/2c2a06e383aa6b04f84ed31548f1cb09
Here is the error I see after one or two hundred thousand successful writes:
RPCError: datastore call commit [while running 'Write To Datastore/Write Mutation to Datastore'] failed: Error code: RESOURCE_EXHAUSTED. Message: Quota exceeded.
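(For anyone reproducing this: Datastore caps each commit at 500 entities, so the rows have to be chunked before writing regardless of which API is used. A generic chunking sketch, independent of either gist, where `iterable` would be the stream of SQL rows:)

```python
from itertools import islice

def batches(iterable, size=500):
    """Yield successive lists of at most `size` items
    (500 is Datastore's per-commit entity limit)."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk
```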
I'm running this on compute engine.
Any help is greatly appreciated!
Thanks,
Nathan
I asked for a quota increase, and someone at Google checked my account to find the problem.
Here is their reply.
This fixed the problem: I just needed to go into App Engine and set a much higher daily spending limit.
Hopefully the code I included above will help others.