FFT-based 2D convolution and correlation in Python

Posted 2019-01-13 16:20

Question:

Is there a FFT-based 2D cross-correlation or convolution function built into scipy (or another popular library)?

There are functions like these:

  • scipy.signal.correlate2d - "the direct method implemented by convolveND will be slow for large data"
  • scipy.ndimage.correlate - "The array is correlated with the given kernel using exact calculation (i.e. not FFT)."
  • scipy.fftpack.convolve.convolve, which I don't really understand, but seems wrong

numarray had a correlate2d() function with an fft=True switch, but I guess numarray was folded into numpy, and I can't tell whether this function was carried over.

Answer 1:

I found scipy.signal.fftconvolve, as also pointed out by magnus, but didn't realize at the time that it's n-dimensional. Since it's built-in and produces the right values, it seems like the ideal solution.

From Example of 2D Convolution:

In [1]: import scipy.signal
   ...: from numpy import asarray
   ...: a = asarray([[ 1, 2, 3],
   ...:              [ 4, 5, 6],
   ...:              [ 7, 8, 9]])

In [2]: b = asarray([[-1,-2,-1],
   ...:              [ 0, 0, 0],
   ...:              [ 1, 2, 1]])

In [3]: scipy.signal.fftconvolve(a, b, mode='same')
Out[3]: 
array([[-13., -20., -17.],
       [-18., -24., -18.],
       [ 13.,  20.,  17.]])

Correct! The STSCI version, on the other hand, requires some extra work to get the boundaries right:

In [4]: stsci.convolve2d(a, b, fft=True)
Out[4]: 
array([[-12., -12., -12.],
       [-24., -24., -24.],
       [-12., -12., -12.]])

(The STSCI method also requires compiling, which I was unsuccessful at (I just commented out the non-Python parts), and it has some bugs, such as modifying its inputs ([1, 2] becomes [[1, 2]]), and so on. So I changed my accepted answer to the built-in fftconvolve() function.)

Correlation, of course, is the same thing as convolution, but with one input reversed (flipped along both axes):

In [5]: a
Out[5]: 
array([[3, 0, 0],
       [2, 0, 0],
       [1, 0, 0]])

In [6]: b
Out[6]: 
array([[3, 2, 1],
       [0, 0, 0],
       [0, 0, 0]])

In [7]: scipy.signal.fftconvolve(a, b[::-1, ::-1])
Out[7]: 
array([[ 0., -0.,  0.,  0.,  0.],
       [ 0., -0.,  0.,  0.,  0.],
       [ 3.,  6.,  9.,  0.,  0.],
       [ 2.,  4.,  6.,  0.,  0.],
       [ 1.,  2.,  3.,  0.,  0.]])

In [8]: scipy.signal.correlate2d(a, b)
Out[8]: 
array([[0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0],
       [3, 6, 9, 0, 0],
       [2, 4, 6, 0, 0],
       [1, 2, 3, 0, 0]])
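
The same equivalence can be checked end to end. A minimal, self-contained sketch (assuming only numpy and scipy.signal; the variable names are arbitrary):

import numpy as np
from scipy import signal

a = np.asarray([[3, 0, 0],
                [2, 0, 0],
                [1, 0, 0]])
b = np.asarray([[3, 2, 1],
                [0, 0, 0],
                [0, 0, 0]])

# FFT-based correlation: convolve with the kernel flipped along both axes
corr_fft = signal.fftconvolve(a, b[::-1, ::-1])

# Direct (non-FFT) correlation for comparison
corr_direct = signal.correlate2d(a, b)

print(np.allclose(corr_fft, corr_direct))  # True, up to floating-point error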

The latest revision of fftconvolve() has been sped up by using power-of-two sizes internally (and then I sped it up more by using a real FFT for real input and 5-smooth lengths instead of powers of 2 :D ).
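
To sketch what those tricks amount to (this is just an illustration of the idea, not fftconvolve's actual implementation; fft_correlate2d is a made-up name, and scipy.fft.next_fast_len assumes SciPy 1.4 or newer):

import numpy as np
from scipy.fft import next_fast_len  # 5-smooth FFT lengths

def fft_correlate2d(a, b):
    # Full 2D cross-correlation of two real arrays via real FFTs
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    out_shape = [sa + sb - 1 for sa, sb in zip(a.shape, b.shape)]
    # Pad each axis up to a 5-smooth length so the FFTs are fast
    fft_shape = [next_fast_len(n) for n in out_shape]
    # Correlation = convolution with the kernel flipped along both axes
    fa = np.fft.rfft2(a, fft_shape)
    fb = np.fft.rfft2(b[::-1, ::-1], fft_shape)
    out = np.fft.irfft2(fa * fb, fft_shape)
    # Trim the circular-convolution padding back to the 'full' output size
    return out[:out_shape[0], :out_shape[1]]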



Answer 2:

Look at scipy.signal.fftconvolve, signal.convolve, and signal.correlate. (There is a signal.correlate2d, but it seems to return a shifted array that isn't centered.)
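
In newer SciPy releases (roughly 0.19 onward), scipy.signal.correlate and scipy.signal.convolve also accept a method='fft' argument, so an FFT-based, centered N-dimensional correlation can look something like this (the array names here are just placeholders):

import numpy as np
from scipy import signal

image = np.random.rand(256, 256)
template = np.random.rand(32, 32)

# FFT-based N-dimensional correlation; mode='same' keeps the output
# the same shape as `image` and centered on it
result = signal.correlate(image, template, mode='same', method='fft')
print(result.shape)  # (256, 256)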



Answer 3:

I think you want the scipy.stsci package:

http://docs.scipy.org/doc/scipy/reference/stsci.html

In [30]: scipy.__version__
Out[30]: '0.7.0'

In [31]: from scipy.stsci.convolve import convolve2d, correlate2d


Answer 4:

I've lost track of the status of this package in scipy, but I know we include ndimage as part of the stsci_python release package as a convenience for our users:

http://www.stsci.edu/resources/software_hardware/pyraf/stsci_python/current/download

or you should be able to pull it from the repository if you prefer:

https://www.stsci.edu/svn/ssb/stsci_python/stsci_python/trunk/ndimage/



Answer 5:

I wrote a cross-correlation/convolution wrapper that takes care of padding and NaNs and includes a simple smoothing wrapper here. It's not a popular package, but it also has no dependencies besides numpy (or fftw for faster FFTs).

I've also implemented an FFT speed-testing script here, in case anyone's interested. It shows, surprisingly, that numpy's FFT is faster than scipy's, at least on my machine.
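
The script itself isn't reproduced here, but the gist of that kind of comparison is roughly this (timings are machine-dependent; scipy.fftpack is the FFT module older SciPy versions shipped):

import timeit
import numpy as np
import scipy.fftpack

x = np.random.rand(1024, 1024)

# Rough, machine-dependent comparison of 2D FFT implementations
t_numpy = timeit.timeit(lambda: np.fft.fft2(x), number=20)
t_scipy = timeit.timeit(lambda: scipy.fftpack.fft2(x), number=20)
print(f"numpy.fft.fft2:     {t_numpy:.3f} s for 20 runs")
print(f"scipy.fftpack.fft2: {t_scipy:.3f} s for 20 runs")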

EDIT: I've moved the code to an N-dimensional version here.