I'm writing a simple service to take data from a couple of sources, munge it together, and use the Google API client to send it to a Google Sheet. That part works fine; the data is not that big.
The issue is that calling .spreadsheets() after building the API service (i.e. build('sheets', 'v4', http=auth).spreadsheets()) causes a memory jump of roughly 30 MB (I did some profiling to pin down where the memory was being allocated). When deployed to GAE, these spikes stick around for long stretches (sometimes hours), creep upwards, and after several requests trigger GAE's 'Exceeded soft private memory limit' error.
I am using memcache for the discovery document and urlfetch for grabbing data, but those are the only other services involved.
I have tried manual garbage collection, toggling threadsafe in app.yaml, and even changing the point at which .spreadsheets() is called, but I can't shake this problem. It's also possible that I am simply misunderstanding something about GAE's architecture, but I know the spike is caused by the call to .spreadsheets() and I am not storing anything in local caches.
Is there a way to either 1) reduce the size of the memory spike from calling .spreadsheets(), or 2) keep the spikes from sticking around in memory (or, preferably, both)? A very simplified gist is below to give an idea of the API calls and request handler; I can share fuller code if needed. I know similar questions have been asked before, but I haven't been able to fix this.
https://gist.github.com/chill17/18f1caa897e6a20201232165aca05239
I ran into this when using the spreadsheets API on a small processor with only 20 MB of usable RAM. The problem is that the Google API client pulls in the entire API discovery document as a string and builds it into a Resource object that it keeps in memory.
If free memory is an issue, construct your own HTTP object and make the desired request manually. See my Spreadsheet() class for an example of creating a new spreadsheet this way; a rough sketch of the idea is below.
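A minimal sketch of the idea (not the full Spreadsheet() class; create_spreadsheet and auth_http are just illustrative names): assuming auth_http is the same authorized httplib2.Http object the question already passes to build() as http=auth, you can POST straight to the Sheets v4 REST endpoint, so no discovery document or Resource object is ever built.

    import json

    SHEETS_URL = 'https://sheets.googleapis.com/v4/spreadsheets'

    def create_spreadsheet(auth_http, title):
        # POST directly to the Sheets v4 REST endpoint; the discovery-based
        # client (and its in-memory Resource object) is never involved.
        body = json.dumps({'properties': {'title': title}})
        resp, content = auth_http.request(
            SHEETS_URL,
            method='POST',
            body=body,
            headers={'Content-Type': 'application/json'},
        )
        if int(resp.status) != 200:
            raise RuntimeError('Sheets API returned %s: %s' % (resp.status, content))
        # The response JSON includes spreadsheetId and spreadsheetUrl.
        return json.loads(content)

The same pattern works for the other Sheets endpoints (e.g. writing values via PUT .../spreadsheets/{spreadsheetId}/values/{range}?valueInputOption=RAW); the discovery client is mostly just building those URLs for you.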