I have a Python script that imports a large CSV file, counts the number of occurrences of each word in the file, and then exports the counts to another CSV file.
But what is happening is that once the counting part is finished and the exporting begins, it says Killed in the terminal.
I don't think this is a memory problem (if it were, I assume I would be getting a memory error and not Killed).
Could it be that the process is taking too long? If so, is there a way to extend the time-out period so I can avoid this?
Here is the code:
import csv
import sys

csv.field_size_limit(sys.maxsize)

counter = {}
with open("/home/alex/Documents/version2/cooccur_list.csv", 'rb') as file_name:
    reader = csv.reader(file_name)
    for row in reader:
        if len(row) > 1:
            pair = row[0] + ' ' + row[1]
            if pair in counter:
                counter[pair] += 1
            else:
                counter[pair] = 1

print 'finished counting'

writer = csv.writer(open('/home/alex/Documents/version2/dict.csv', 'wb'))
for key, value in counter.items():
    writer.writerow([key, value])
And the Killed happens after finished counting has printed, and the full message is:

killed (program exited with code: 137)
I doubt anything is killing the process just because it takes a long time. Killed generically means something from the outside terminated the process, but it's probably not Ctrl-C in this case, since that would cause Python to exit on a KeyboardInterrupt exception. Also, in Python you would get a MemoryError exception if that were the problem. What might be happening is that you're hitting a bug in Python or the standard library that causes the process to crash.
There are two storage areas involved: the stack and the heap. The stack is where the current state of a method call is kept (i.e. local variables and references), and the heap is where objects are stored.

I guess there are too many keys in the counter dict, which consumes too much memory in the heap region, so the process runs out of memory (note that Python itself would raise a MemoryError, but here the operating system kills the process before that happens). To avoid it, don't create a giant object such as the counter. In short:

1. StackOverflow: a program that creates too many local variables.
2. OutOfMemory: a program that creates a giant dict with too many keys.
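As a rough illustration of how a dict's heap footprint grows with its key count (a minimal Python 3 sketch with toy data, not the asker's actual file):

```python
import sys

# Build dicts of increasing size and compare the container's own heap
# footprint. Note sys.getsizeof reports only the dict structure itself,
# not the keys and values it references, so real usage is even higher.
small = {i: 1 for i in range(1_000)}
large = {i: 1 for i in range(1_000_000)}

print(sys.getsizeof(small))  # tens of kilobytes
print(sys.getsizeof(large))  # tens of megabytes
```

With real word-pair keys (strings), each key also occupies its own heap space, so a counter over a very large CSV can easily exhaust available memory.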
Exit code 137 (128+9) indicates that your program exited due to receiving signal 9, which is SIGKILL. This also explains the killed message. The question is, why did you receive that signal?

The most likely reason is probably that your process crossed some limit on the amount of system resources that you are allowed to use. Depending on your OS and configuration, this could mean you had too many open files, used too much filesystem space, or something else. The most likely is that your program was using too much memory. Rather than risking things breaking when memory allocations started failing, the system sent a kill signal to the process that was using too much memory.
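You can see this 128+signal convention for yourself with a throwaway shell process (a generic sketch, not tied to the asker's script):

```shell
# Start a child shell that immediately sends SIGKILL (signal 9) to itself,
# then inspect the exit status the parent shell observed.
status=0
sh -c 'kill -KILL $$' || status=$?
echo "$status"   # 137 = 128 + 9
```

The OOM killer terminates a process the same way, which is why the Python script never gets a chance to raise an exception or print a traceback.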
As I commented earlier, one reason you might hit a memory limit after printing finished counting is that your call to counter.items() in your final loop allocates a list that contains all the keys and values from your dictionary. If your dictionary had a lot of data, this might be a very big list. A possible solution would be to use counter.iteritems(), which is a generator. Rather than returning all the items in a list, it lets you iterate over them with much less memory usage.

So, I'd suggest trying this as your final loop:

for key, value in counter.iteritems():
    writer.writerow([key, value])
Note that in Python 3, items returns a "dictionary view" object which does not have the same overhead as Python 2's version. It replaces iteritems, so if you later upgrade Python versions, you'll end up changing the loop back to the way it was.
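To illustrate the Python 3 behaviour described above (a minimal sketch with a toy dict):

```python
import sys

counts = {f"word{i}": i for i in range(100_000)}

# In Python 3, dict.items() returns a lightweight view over the dict;
# it does not copy the keys and values into a new container.
view = counts.items()
materialized = list(counts.items())  # this DOES build a full list

print(sys.getsizeof(view))          # a few dozen bytes, regardless of dict size
print(sys.getsizeof(materialized))  # grows with the number of entries

# The view is still iterable, so the original loop works unchanged:
for key, value in counts.items():
    pass
```

This is why, under Python 3, the original `for key, value in counter.items():` loop is already memory-friendly and needs no change.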