How can I find the "entropy" of an image with ImageMagick, preferably via mini_magick, in Ruby? I need this as part of a larger project: finding the "interestingness" of an image so I can crop it around that region.
I found a good example in Python/Django, which I read as roughly the following:
from PIL import Image
from math import log

image = Image.open('example.png')
histogram = image.histogram()  # a list of pixel counts, one for each pixel value

# Normalize the counts into probabilities.
total = float(sum(histogram))
probabilities = [count / total for count in histogram]

# Weight each probability by its base-2 log, skipping empty bins.
entropy_list = [p * log(p, 2) for p in probabilities if p != 0]

# Sum the weighted values and negate the result.
entropy = -sum(entropy_list)
This would translate to the formula entropy = -sum(p * log2(p)).
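As a quick sanity check of that formula (a toy example with made-up counts): a histogram of [2, 2, 4] gives probabilities [0.25, 0.25, 0.5], so the entropy should come out to 1.5 bits:

# Toy check of entropy = -sum(p * log2(p)) on made-up histogram counts.
counts = [2, 2, 4]
total = counts.inject(0) { |s, c| s + c }.to_f
probabilities = counts.map { |c| c / total }  # => [0.25, 0.25, 0.5]
weighted = probabilities.reject { |p| p.zero? }.map { |p| p * Math.log(p) / Math.log(2) }
entropy = -weighted.inject(0) { |s, w| s + w }
puts entropy # => 1.5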
My questions: Did I interpret the Django/Python code correctly? Can I fetch a histogram with Ruby's mini_magick at all, and if so, how?
Most important: is this algorithm any good in the first place? Would you suggest a better one for finding the "entropy", "amount of changing pixels", or "gradient depth" in (parts of) images?
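On the histogram question: mini_magick itself does not seem to expose a histogram call, but ImageMagick's convert can print one as text via -format %c histogram:info:-. A rough sketch (the shell-out via Open3 and the parsing regex are my own assumptions, not mini_magick API):

# Rough sketch: ask ImageMagick for a textual histogram and parse it into
# a Hash of "(r,g,b)" color strings => pixel counts. Assumes the `convert`
# binary is on the PATH.
require 'open3'

def color_histogram(path)
  output, _status = Open3.capture2(
    "convert", path, "-format", "%c", "histogram:info:-"
  )
  # Each line looks like: "  1234: (  0,  0,  0) #000000 black"
  output.scan(/^\s*(\d+):\s*(\([^)]*\))/).each_with_object({}) do |(count, color), hist|
    hist[color] = count.to_i
  end
end

histogram = color_histogram("example.png")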
Edit: Using, among other things, the resources provided in the answer below, I came up with the following working code:
# Compute the entropy of an image slice.
def entropy_slice(image_data, x, y, width, height)
  slice = image_data.crop(x, y, width, height)
  entropy(slice)
end

# Compute the entropy of an image, defined as -sum(p * log2(p)).
# Note: instead of log2, which is only available in Ruby >= 1.9, we use
# log(p)/log(2), which has the same effect.
def entropy(image_slice)
  hist = image_slice.color_histogram
  hist_size = hist.values.inject(0) { |sum, x| sum + x }.to_f

  entropy = 0
  hist.values.each do |h|
    p = h.to_f / hist_size
    entropy += p * (Math.log(p) / Math.log(2)) if p != 0
  end
  -entropy
end
Where image_data is a Magick::Image (from RMagick).
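For illustration, a hypothetical call that compares the entropy of the left and right halves of an image (the file name is made up; older RMagick versions use require 'RMagick' instead):

require 'rmagick'

image = Magick::Image.read("example.png").first  # read returns an array of images

half = image.columns / 2
left_entropy  = entropy_slice(image, 0, 0, half, image.rows)
right_entropy = entropy_slice(image, half, 0, image.columns - half, image.rows)
# The half with the higher entropy is the more "interesting" one to keep when cropping.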
This is used in the smartcropper gem, which allows smart slicing and cropping for images with e.g. paperclip.
Entropy is explained here (with MATLAB source, but hopefully the qualitative explanation helps): Introduction to Entropy (Data Mining in MATLAB).
For a more formal explanation, see "Elements of Information Theory" (Chapter 2), by Cover and Thomas.
With Facets' Array#entropy:
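(A rough sketch, assuming require 'facets' loads the core Array extensions and that Array#entropy treats the array as a sample, returning the Shannon entropy of the element frequencies:)

require 'facets'

pixels = [0, 0, 0, 128, 255, 255]  # hypothetical flat list of pixel values
puts pixels.entropy                # entropy of the value frequencies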