I've been pickling objects to the filesystem and reading them back when I need to work with them. Currently I have this code for that purpose:
    # requires: import os, import pickle
    def pickle(self, directory, filename):
        if not os.path.exists(directory):
            os.makedirs(directory)
        with open(os.path.join(directory, filename), 'wb') as handle:
            pickle.dump(self, handle)

    @staticmethod
    def load(filename):
        with open(filename, 'rb') as handle:
            element = pickle.load(handle)
        return element
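For context, here is a minimal runnable sketch of how those two methods round-trip an object; the Thing class, its value field, and the temp directory are hypothetical stand-ins, not part of the original code:

```python
import os
import pickle
import tempfile

class Thing:
    """Hypothetical class carrying the methods from the question."""
    def __init__(self, value):
        self.value = value

    def pickle(self, directory, filename):
        # create the target directory if it does not exist yet
        if not os.path.exists(directory):
            os.makedirs(directory)
        # serialize this instance to a file on the local filesystem
        with open(os.path.join(directory, filename), 'wb') as handle:
            pickle.dump(self, handle)

    @staticmethod
    def load(filename):
        # read the bytes back and reconstruct the original object
        with open(filename, 'rb') as handle:
            return pickle.load(handle)

directory = tempfile.mkdtemp()        # throwaway directory for the demo
Thing(42).pickle(directory, 'thing.pkl')
restored = Thing.load(os.path.join(directory, 'thing.pkl'))
print(restored.value)  # -> 42
```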
Now I'm moving my application (Django) to Google App Engine, and I've learned that App Engine does not allow writing to the filesystem. Google Cloud Storage seems to be my only choice, but I can't figure out how to pickle my objects into Cloud Storage objects and read them back to recreate the original Python objects.
You can use the Cloud Storage client library.
Instead of open(), use cloudstorage.open() (or gcs.open() if importing cloudstorage as gcs, as in the above-mentioned doc), and note that the full file path starts with the GCS bucket name (as a directory). More details in the cloudstorage.open() documentation.