I am looking for how to resample a numpy array representing image data at a new size, preferably having a choice of the interpolation method (nearest, bilinear, etc.). I know there is scipy.misc.imresize, which does exactly this by wrapping PIL's resize function. The only problem is that, since it uses PIL, the numpy array has to conform to image formats, giving me a maximum of 4 "color" channels.
I want to be able to resize arbitrary images, with any number of "color" channels. I was wondering if there is a simple way to do this in scipy/numpy, or if I need to roll my own.
I have two ideas for how to concoct one myself:
- a function that runs scipy.misc.imresize on every channel separately (roughly sketched below)
- create my own using scipy.ndimage.interpolation.affine_transform
The first one would probably be slow for large data, and the second one does not seem to offer any interpolation method other than splines.
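A rough sketch of the first idea, assuming an older SciPy where scipy.misc.imresize still exists (the helper name and test array are hypothetical):

```python
import numpy as np
import scipy.misc  # imresize is only available in older SciPy releases (it wraps PIL)

def resize_per_channel(arr, new_shape, interp='bilinear'):
    # Hypothetical helper: resize an (rows, cols, channels) uint8 array by
    # running scipy.misc.imresize on each 2D channel separately.
    channels = [scipy.misc.imresize(arr[..., c], new_shape, interp=interp)
                for c in range(arr.shape[-1])]
    return np.stack(channels, axis=-1)

# e.g. a 6-channel "image"
data = (np.random.random((64, 64, 6)) * 255).astype(np.uint8)
print(resize_per_channel(data, (128, 128)).shape)  # (128, 128, 6)
```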
I've recently found an issue with scipy.ndimage.interpolation.zoom, which I've submitted as a bug report: https://github.com/scipy/scipy/issues/3203
As an alternative (or at least for me), I've found that scikit-image's skimage.transform.resize works correctly: http://scikit-image.org/docs/dev/api/skimage.transform.html#skimage.transform.resize
However, it works differently from scipy's interpolation.zoom: rather than specifying a multiplier, you specify the output shape that you want. This works for 2D and 3D images.
For just 2D images, you can use transform.rescale and specify a multiplier or scale as you would with interpolation.zoom.
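For instance, a minimal sketch (the 7-channel array and names are illustrative):

```python
import numpy as np
from skimage.transform import resize

# A hypothetical image with 7 "color" channels -- any number works.
img = np.random.random((128, 128, 7))

# Pass the desired output shape; order=1 is bilinear, order=0 is nearest.
out = resize(img, (64, 64, 7), order=1)
print(out.shape)  # (64, 64, 7)
```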
Based on your description, you want scipy.ndimage.zoom. Bilinear interpolation would be order=1, nearest is order=0, and cubic is the default (order=3). zoom is specifically for regularly-gridded data that you want to resample to a new resolution. As a quick example:
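A minimal sketch, using a tiny illustrative array (the shapes it prints are shown under "And the result:" below):

```python
import numpy as np
import scipy.ndimage

# Tiny illustrative 2D array standing in for image data.
x = np.arange(9, dtype=float).reshape(3, 3)

# Upsample by a factor of 2 with bilinear interpolation (order=1).
zoomed = scipy.ndimage.zoom(x, 2, order=1)

print(x.shape)
print(zoomed.shape)
```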
And the result:
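```
(3, 3)
(6, 6)
```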
Edit: As Matt S. pointed out, there are a couple of caveats for zooming multi-band images. I'm copying the portion below almost verbatim from one of my earlier answers:
Zooming also works for 3D (and nD) arrays. However, be aware that if you zoom by 2x, for example, you'll zoom along all axes.
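A minimal sketch with an illustrative 3x3x3 array (the printed shapes appear under "This yields:" below):

```python
import numpy as np
import scipy.ndimage

x = np.arange(27, dtype=float).reshape(3, 3, 3)

# A single zoom factor is applied along every axis.
zoomed = scipy.ndimage.zoom(x, 2)

print(x.shape)
print(zoomed.shape)
```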
This yields:
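```
(3, 3, 3)
(6, 6, 6)
```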
In the case of multi-band images, you usually don't want to interpolate along the "z" axis, creating new bands.
If you have something like a 3-band, RGB image that you'd like to zoom, you can do this by specifying a sequence of zoom factors, one per axis:
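A minimal sketch, assuming a (rows, cols, bands) layout with the band axis last (the printed shapes appear under "This yields:" below):

```python
import numpy as np
import scipy.ndimage

# Hypothetical RGB image: rows x cols x 3 bands.
rgb = np.random.random((100, 100, 3))

# Zoom rows and columns by 2x but leave the band axis untouched,
# by giving one zoom factor per axis.
zoomed = scipy.ndimage.zoom(rgb, (2, 2, 1), order=1)

print(rgb.shape)
print(zoomed.shape)
```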
This yields:
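```
(100, 100, 3)
(200, 200, 3)
```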
Have you looked at Scikit-image? Its transform.pyramid_* functions might be useful for you.

If you want to resample, then you should look at Scipy's cookbook for rebinning. In particular, the congrid function defined at the end will support rebinning or interpolation (equivalent to the function in IDL with the same name). This should be the fastest option if you don't want interpolation.

You can also directly use scipy.ndimage.map_coordinates, which will do a spline interpolation for any kind of resampling (including unstructured grids). I find map_coordinates to be slow for large arrays (nx, ny > 200).
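A minimal sketch of resampling a 2D array onto a finer regular grid with map_coordinates (grid sizes and names are illustrative):

```python
import numpy as np
from scipy.ndimage import map_coordinates

Z = np.random.random((10, 10))

# (row, col) coordinates of a 50x50 grid, expressed in the original
# array's index coordinates (0..9 along each axis).
rows = np.linspace(0, 9, 50)
cols = np.linspace(0, 9, 50)
coords = np.meshgrid(rows, cols, indexing='ij')

# order=0 -> nearest, order=1 -> linear, order=3 -> cubic spline.
Z2 = map_coordinates(Z, coords, order=1)
print(Z2.shape)  # (50, 50)
```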
For interpolation on structured grids, I tend to use scipy.interpolate.RectBivariateSpline. You can choose the order of the spline (linear, quadratic, cubic, etc.), and even choose it independently for each axis. An example:
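A minimal sketch, with illustrative grid sizes and names:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Coarse structured grid and data: Z[i, j] is sampled at (y[i], x[j]).
x = np.linspace(0, 1, 10)
y = np.linspace(0, 1, 10)
Z = np.random.random((10, 10))

# kx = ky = 1 gives bilinear interpolation; 3 would give bicubic.
spline = RectBivariateSpline(y, x, Z, kx=1, ky=1)

# Evaluate on a finer grid.
y2 = np.linspace(0, 1, 50)
x2 = np.linspace(0, 1, 50)
Z2 = spline(y2, x2)
print(Z2.shape)  # (50, 50)
```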
In this case you're doing a bilinear interpolation (kx = ky = 1). The 'nearest' kind of interpolation is not supported, as all this does is a spline interpolation over a rectangular mesh. It's also not the fastest method. If you're after bilinear or bicubic interpolation, it is generally much faster to do two 1D interpolations:
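A minimal sketch of the two-pass approach using scipy.interpolate.interp1d (names and grid sizes are illustrative; use kind='cubic' for bicubic):

```python
import numpy as np
from scipy.interpolate import interp1d

# Coarse grid and data: Z[i, j] is sampled at (y[i], x[j]).
x = np.linspace(0, 1, 10)
y = np.linspace(0, 1, 10)
Z = np.random.random((10, 10))

# Fine grid to resample onto.
x2 = np.linspace(0, 1, 50)
y2 = np.linspace(0, 1, 50)

# First pass along the x axis (axis=1), second pass along the y axis (axis=0).
Z1 = interp1d(x, Z, kind='linear', axis=1)(x2)   # shape (10, 50)
Z2 = interp1d(y, Z1, kind='linear', axis=0)(y2)  # shape (50, 50)
print(Z2.shape)
```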
You can also use kind='nearest', but in that case get rid of the transverse arrays.