Using Python/NumPy, I'm trying to load an array from a .npz file, but it fails with what I believe is a memory error:
In [1]: import numpy as np
In [2]: npzfile = np.load('cuda400x400x2000.npz')
In [3]: U = npzfile['U']
---------------------------------------------------------------------------
SystemError Traceback (most recent call last)
<ipython-input-3-0539104595dc> in <module>()
----> 1 U = npzfile['U']
/usr/lib/pymodules/python2.7/numpy/lib/npyio.pyc in __getitem__(self, key)
232 if bytes.startswith(format.MAGIC_PREFIX):
233 value = BytesIO(bytes)
--> 234 return format.read_array(value)
235 else:
236 return bytes
/usr/lib/pymodules/python2.7/numpy/lib/format.pyc in read_array(fp)
456 # way.
457 # XXX: we can probably chunk this to avoid the memory hit.
--> 458 data = fp.read(int(count * dtype.itemsize))
459 array = numpy.fromstring(data, dtype=dtype, count=count)
460
SystemError: error return without exception set
If properly loaded, U will contain 400*400*2000 doubles, so that's about 2.5 GB. It seems the system has enough memory available:
bogeholm@bananabot ~/Desktop $ free -m
             total       used       free     shared    buffers     cached
Mem:          7956       3375       4581          0         35       1511
-/+ buffers/cache:       1827       6128
Swap:        16383          0      16383
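For reference, here is the arithmetic behind the 2.5 GB figure, plus a quick check that is easy to run in a fresh session (the note about doubling is my reading of the read_array frame in the traceback above):

import sys
import numpy as np

# 400 x 400 x 2000 float64 values, 8 bytes each
n_elements = 400 * 400 * 2000
n_bytes = n_elements * np.dtype(np.float64).itemsize
print('%d elements -> %.2f GB' % (n_elements, n_bytes / 1e9))    # ~2.56 GB

# Judging from the traceback, format.read_array() first reads the raw bytes
# into a Python string and then copies them with numpy.fromstring, so peak
# usage is probably closer to twice that figure.

# A 32-bit interpreter could not address that much in a single process anyway;
# sys.maxsize exceeds 2**32 only on a 64-bit build.
print('64-bit Python: %s' % (sys.maxsize > 2 ** 32))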
Is this a memory issue? Can it be fixed in any way other than buying more RAM? The box runs Linux Mint DE with Python 2.7.3rc2 and NumPy 1.6.2.
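For context, the kind of workaround I'm hoping for is something along these lines, assuming I regenerate the data and save it with np.save as a plain .npy file (the filename below is hypothetical) instead of packing it into an .npz archive; I haven't verified that this sidesteps the problem:

import numpy as np

# Memory-map the array instead of reading all ~2.5 GB into RAM at once.
# np.load only honours mmap_mode for plain .npy files, not .npz archives.
U = np.load('cuda400x400x2000.npy', mmap_mode='r')

# Only the slices that are actually touched get read from disk; this still
# needs a 64-bit process to map the whole file.
first_slab = np.array(U[:, :, 0])    # copy one 400x400 slab into memory
print(first_slab.shape)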
Cheers,
\T