pickling python objects to google cloud storage

Posted 2019-03-02 12:34

I've been pickling objects to the filesystem and reading them back when I need to work with them again. Currently I have this code for that purpose:

import os
import pickle

def pickle(self, directory, filename):
    # Create the target directory if it doesn't exist yet
    if not os.path.exists(directory):
        os.makedirs(directory)
    with open(os.path.join(directory, filename), 'wb') as handle:
        pickle.dump(self, handle)

@staticmethod
def load(filename):
    with open(filename, 'rb') as handle:
        element = pickle.load(handle)
    return element

Now I'm moving my application (Django) to Google App Engine and discovered that App Engine does not allow writing to the filesystem. Google Cloud Storage seems to be my only choice, but I could not figure out how to pickle my objects as Cloud Storage objects and read them back to recreate the original Python objects.

1 Answer
Rolldiameter · 2019-03-02 13:20

You can use the Cloud Storage client library.

Instead of open(), use cloudstorage.open() (or gcs.open() if you import cloudstorage as gcs, as in the above-mentioned doc), and note that the full file path starts with the GCS bucket name (as if it were a directory).

More details in the cloudstorage.open() documentation.
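As a rough sketch of how the question's pickle/load pair could be adapted, here are two standalone helper functions. This assumes the first-generation App Engine cloudstorage client library (Python 2 era) is available; the function names save_to_gcs/load_from_gcs and the bucket/filename parameters are illustrative, not part of any API. Since cloudstorage.open() only deals in paths under a bucket, the pickled bytes are produced with pickle.dumps() and written to the GCS file handle directly:

```python
import pickle


def save_to_gcs(obj, bucket, filename):
    """Serialize obj with pickle and write it to /bucket/filename on GCS."""
    # Imported lazily so the module loads even where the GAE
    # cloudstorage library is not installed (an assumption of this sketch).
    import cloudstorage as gcs
    path = '/{}/{}'.format(bucket, filename)  # path starts with the bucket name
    with gcs.open(path, 'w') as handle:
        handle.write(pickle.dumps(obj))


def load_from_gcs(bucket, filename):
    """Read /bucket/filename from GCS and reconstruct the pickled object."""
    import cloudstorage as gcs
    path = '/{}/{}'.format(bucket, filename)
    with gcs.open(path) as handle:  # default mode is read
        return pickle.loads(handle.read())
```

On the newer runtimes, the google-cloud-storage library offers the same round trip via Blob.upload_from_string(pickle.dumps(obj)) and pickle.loads(blob.download_as_bytes()), so the pickling side of the code stays identical either way.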
