Memory error when opening the file with read()

Posted 2019-08-17 03:43

I'm new to Python and I'm editing a program where I need to open a file, but it's more than 1.5 GB, so I get a memory error. The code is:

f=open('thumbdata3.dat','rb')
tdata = f.read()
f.close()

ss = '\xff\xd8'
se = '\xff\xd9'

count = 0
start = 0
while True:
    x1 = tdata.find(ss,start)
    if x1 < 0:
        break
    x2 = tdata.find(se,x1)
    jpg = tdata[x1:x2+1]
    count += 1
    fname = 'extracted%d03.jpg' % (count)
    fw = open(fname,'wb')
    fw.write(jpg)
    fw.close()
    start = x2+2

So I get a

MemoryError

in

tdata = f.read()

there. How do I modify the code so the file is split into pieces while being read?

Tags: python file
1 answer
聊天终结者
#2 · 2019-08-17 04:25

From the description it seems the memory footprint is the problem here: `f.read()` loads the entire 1.5 GB file into memory at once. We can use a generator instead, so that only one chunk of the data is loaded at a time.

from itertools import chain, islice

def piecewise(iterable, n):
    "piecewise('Python', 2) => 'Py' 'th' 'on'"
    iterable = iter(iterable)
    while True:
        try:
            first = next(iterable)
        except StopIteration:
            return  # PEP 479: don't let StopIteration escape the generator
        yield chain([first], islice(iterable, n - 1))

l = ...  # number of lines per split file; pick a value that fits in memory
file_large = 'large_file.txt'
with open(file_large) as bigfile:
    for i, lines in enumerate(piecewise(bigfile, l)):
        file_split = '{}.{}'.format(file_large, i)
        with open(file_split, 'w') as f:
            f.writelines(lines)