I have basic 2-D numpy arrays and I'd like to "downsample" them to a coarser resolution. Is there a simple numpy or scipy module that can easily do this? I should also note that these arrays are being displayed geographically via Basemap modules.
scikit-image has implemented a working version of downsampling here, although they shy away from calling it downsampling, since it isn't downsampling in the DSP sense, if I understand correctly: http://scikit-image.org/docs/dev/api/skimage.measure.html#skimage.measure.block_reduce

It works very well, and it is the only downsampler I found in Python that can deal with np.nan in the image.
I have downsampled gigantic images with this very quickly.

Because the OP just wants a coarser resolution, I thought I would share my way of reducing the number of pixels by half in each dimension. It takes the mean of 2x2 blocks, and can be applied multiple times to reduce by further factors of 2.
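The 2x2-block mean just described can be sketched with plain slicing (my own sketch, not necessarily the answerer's exact code; it assumes both dimensions are even):

```python
import numpy as np

def halve(a):
    """Mean of non-overlapping 2x2 blocks; assumes both dimensions are even."""
    return 0.25 * (a[::2, ::2] + a[1::2, ::2] + a[::2, 1::2] + a[1::2, 1::2])

a = np.arange(16, dtype=float).reshape(4, 4)
print(halve(a))                # [[ 2.5  4.5]
                               #  [10.5 12.5]]
print(halve(halve(a)).shape)   # applied twice: (4, 4) -> (1, 1)
```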
imresize and ndimage.interpolation.zoom look like they do what you want
I haven't tried imresize before, but here is how I have used ndimage.interpolation.zoom.
a is then a 4x4 matrix with interpolated values in it
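A minimal sketch of such a call (assuming a 2x2 input and a zoom factor of 2; note that in current SciPy the function lives at scipy.ndimage.zoom, and scipy.misc.imresize has since been removed):

```python
import numpy as np
from scipy.ndimage import zoom  # ndimage.interpolation.zoom in older SciPy

x = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# A zoom factor of 2 spline-interpolates the 2x2 input up to 4x4.
a = zoom(x, 2)
print(a.shape)  # (4, 4)
```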
When downsampling, interpolation is the wrong thing to do. Always use an aggregated approach.
I use block means to do this, using a "factor" to reduce the resolution.
E.g., a (100, 200) shape array using a factor of 5 (5x5 blocks) results in a (20, 40) array.
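One way to compute those block means is a reshape (a sketch that assumes the factor divides both dimensions evenly; the answer's original code may have done this differently, e.g. with scipy.ndimage.mean):

```python
import numpy as np

def block_mean(ar, fact):
    # Assumes both dimensions are divisible by fact.
    sx, sy = ar.shape
    # Split each axis into (blocks, fact) and average over the block axes.
    return ar.reshape(sx // fact, fact, sy // fact, fact).mean(axis=(1, 3))

a = np.random.rand(100, 200)
print(block_mean(a, 5).shape)  # (20, 40)
```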
This might not be what you're looking for, but I thought I'd mention it for completeness.
You could try installing scikits.samplerate (docs), which is a Python wrapper for libsamplerate. It provides nice, high-quality resampling algorithms, but as far as I can tell, it only works in 1-D. You might be able to resample your 2-D signal first along one axis and then along the other, but I'd think that might counteract the benefits of high-quality resampling to begin with.