Possible Duplicate:
Is there a memory efficient and fast way to load big json files in python?
So I have some rather large JSON-encoded files. The smallest is 300MB, and that is by far the smallest; the rest are multiple GB, anywhere from around 2GB to 10GB+.
I run out of memory when trying to load a file with Python. Right now I'm just running some tests to get a rough idea of how long processing these files will take, so I can decide where to go from here. Here is the code I'm using to test:
from datetime import datetime
import json
# Crude timing: print a timestamp before and after the load.
print datetime.now()
f = open('file.json', 'r')
json.load(f)   # this is where the MemoryError is raised
f.close()
print datetime.now()
Not too surprisingly, Python gives me a MemoryError. It appears that json.load() calls json.loads(f.read()), which tries to read the entire file into memory first, and that clearly isn't going to work with files this size.
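For reference, json.load(fp) is essentially a thin wrapper that reads the whole file into one string and passes it to json.loads, roughly like the simplified sketch below, so the string alone needs at least as much memory as the file before any objects are even built:

import json

def load(fp):
    # Simplified view of what json.load does: the entire file is read
    # into a single string before parsing starts, so a 10GB file needs
    # 10GB+ of RAM just for the string.
    return json.loads(fp.read())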
Any way I can solve this cleanly?
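What I'm imagining is some kind of streaming/iterative parser, for example the third-party ijson library, which walks the file and yields items one at a time instead of building the whole document. A minimal sketch, assuming the top-level JSON value is an array of objects (the file name is just a placeholder):

import ijson  # third-party streaming JSON parser

with open('file.json', 'rb') as f:
    # The 'item' prefix yields each element of the top-level array
    # one at a time, so only one element is held in memory at once.
    for obj in ijson.items(f, 'item'):
        pass  # process obj here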
I know this is old, but I don't think it's a duplicate. While the answer is the same, the question is different: the "duplicate" asks how to read large files efficiently, whereas this question deals with files that won't even fit into memory at all. Efficiency isn't the concern.