I want to find the energy of an image frame. This is how I calculated it in MATLAB:
[~,LH,HL,HH] = dwt2(rgb2gray(maskedImage),'db1'); % applying dwt
E = HL.^2 + LH.^2 + HH.^2; % Calculating the energy of each pixel.
Eframe = sum(sum(E))/(m*n); % m,n row and columns of image.
When I programmed the same calculation in Python for the same image, the energy comes out as 170 instead of the expected 0.7. Where did my program go wrong? Please suggest.
#!/usr/bin/python
import numpy as np
import cv2
import pywt
im = cv2.cvtColor(maskedimage,cv2.COLOR_BGR2GRAY)
m,n = im.shape
cA, (cH, cV, cD) = pywt.dwt2(im,'db1')
# a - LL, h - LH, v - HL, d - HH as in matlab
cHsq = [[elem * elem for elem in inner] for inner in cH]
cVsq = [[elem * elem for elem in inner] for inner in cV]
cDsq = [[elem * elem for elem in inner] for inner in cD]
Energy = (np.sum(cHsq) + np.sum(cVsq) + np.sum(cDsq))/(m*n)
print(Energy)
The problem with your analysis is that numpy arrays and MATLAB matrices are stored in different orders by default. The first dimension of a 2D numpy array is the row, while the first dimension of a 2D MATLAB matrix is the column. The dwt2 function depends on this order, so to get the same output as MATLAB's dwt2 you need to transpose the numpy array before passing it in.

Further, pywt.dwt2 returns numpy arrays, not lists, so you can do mathematical operations on them directly, just as you do in MATLAB. You can also get the total number of pixels from the array's size attribute, saving you from having to multiply m and n.

So the following should give results equivalent to MATLAB, assuming you have the order of the color channels correct (BGR vs. RGB):
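A minimal sketch of that corrected version. A small synthetic array stands in for the grayscale image here so the snippet is self-contained; in the real script it would come from cv2.cvtColor(maskedimage, cv2.COLOR_BGR2GRAY) as in the question.

```python
import numpy as np
import pywt

# Stand-in for the grayscale frame; in the real script this is the
# result of cv2.cvtColor(maskedimage, cv2.COLOR_BGR2GRAY).
im = np.arange(64, dtype=np.float64).reshape(8, 8)

# Transpose so the array matches MATLAB's column-major orientation
# before handing it to dwt2.
cA, (cH, cV, cD) = pywt.dwt2(im.T, 'db1')

# pywt.dwt2 returns numpy arrays, so square and sum them directly,
# and divide by the pixel count via the size attribute (no m*n needed).
Energy = (np.sum(cH**2) + np.sum(cV**2) + np.sum(cD**2)) / im.size
print(Energy)  # 16.25 for this synthetic image
```

Note that the uint8 range matters too: cv2 loads images as uint8 (0-255), so if your MATLAB pipeline worked on double images scaled to [0, 1], divide the array by 255.0 first to compare like with like.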