Python dictionary loaded from disk takes too much memory

Posted 2019-05-05 21:02

I have a dictionary pickled on disk with a size of ~780 MB (on disk). However, when I load that dictionary into memory, its size swells unexpectedly to around 6 GB. Is there any way to keep the in-memory size close to the on-disk size? (It would be fine if it took around 1 GB in memory, but 6 GB is strange behavior.) Is there a problem with the pickle module, or should I save the dictionary in some other format?

Here is how I am loading the file:

import pickle

with open('py_dict.pickle', 'rb') as file:
    py_dict = pickle.load(file)

Any ideas, help, would be greatly appreciated.
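This kind of blow-up is normal: every key and value becomes a full Python object with per-object overhead, while pickle stores them compactly. A small sketch (with a made-up example dict, not your data) illustrates the gap between pickled size and in-memory size:

```python
import pickle
import sys

# Build a small dict of string keys/values to illustrate per-object overhead.
d = {f"key_{i}": f"value_{i}" for i in range(10_000)}

pickled = pickle.dumps(d)

# Rough in-memory footprint: the dict container plus each key and value object.
# (sys.getsizeof is shallow, so we sum over the contents ourselves.)
in_memory = sys.getsizeof(d) + sum(
    sys.getsizeof(k) + sys.getsizeof(v) for k, v in d.items()
)

print(f"pickled:   {len(pickled):>9,} bytes")
print(f"in memory: {in_memory:>9,} bytes")
```

The in-memory figure comes out several times larger than the pickle, for the same reason your 780 MB file becomes multiple GB of live objects.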

2 Answers
ゆ 、 Hurt°
Answered 2019-05-05 21:28

If you're using pickle just for storing large values in a dictionary, or a very large number of keys, you should consider using shelve instead.

import shelve

# Open (or create) a shelf file; it behaves like a dict backed by disk.
s = shelve.open('shelve.bin')
s['a'] = 'value'
s.close()

This loads each key/value pair only as it is needed, keeping the rest on disk.
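A slightly fuller sketch of the same idea, writing a shelf and then reading a single entry back (the filename and payloads are illustrative):

```python
import shelve

# Write values to a shelf; each key/value pair is pickled to disk individually.
with shelve.open('shelve.bin') as s:
    for i in range(1000):
        s[str(i)] = {'payload': i}

# Later, reopen read-only: only the entries you actually access are unpickled.
with shelve.open('shelve.bin', flag='r') as s:
    print(s['42'])  # loads just this one value from disk
```

Note that shelve keys must be strings, and that using the shelf as a context manager closes it cleanly.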

Root(大扎)
Answered 2019-05-05 21:32

Store the data in an SQL database and use queries to fetch only the entries you need, instead of loading everything into memory at once.
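For example, with the standard-library sqlite3 module (table and column names here are illustrative, not from the question):

```python
import sqlite3

# A minimal key/value table in SQLite instead of one big in-memory dict.
conn = sqlite3.connect('py_dict.db')
conn.execute('CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)')
conn.executemany(
    'INSERT OR REPLACE INTO kv (key, value) VALUES (?, ?)',
    [('alpha', '1'), ('beta', '2')],
)
conn.commit()

# Look up a single key without loading anything else into memory.
row = conn.execute('SELECT value FROM kv WHERE key = ?', ('alpha',)).fetchone()
print(row[0])  # -> '1'
conn.close()
```

The PRIMARY KEY index makes single-key lookups fast, and the database file stays on disk, so memory use is bounded by what you query.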
