I am using Pillow and numpy, but I have a problem with the conversion between a Pillow Image object and a numpy array.
When I execute the following code, the result is weird.
im = Image.open(os.path.join(self.img_path, ifname))
print im.size
in_data = np.asarray(im, dtype=np.uint8)
print in_data.shape
The result is:
(1024, 768)
(768, 1024)
Why are the dimensions swapped?
Actually this is because most image libraries give you images that are transposed compared to numpy arrays. This is (I think) because you write image files line by line, so the first index (let's say x) refers to the line number (so x is the vertical axis) and the second index (y) refers to the subsequent pixel in the line (so y is the horizontal axis), which goes against our everyday sense of coordinates. If you want to handle it correctly you need to remember to write:
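For example, a minimal sketch using Pillow and numpy as in the question (the file name is a placeholder, and it assumes a greyscale image so the array is 2D):

from PIL import Image
import numpy as np

im = Image.open("input.png")                # placeholder path; im.size is (width, height)
in_data = np.asarray(im, dtype=np.uint8).T  # transpose so in_data.shape matches (width, height)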
and consequently:
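Continuing the sketch above, transpose back before converting the array into a Pillow image again (Image.fromarray and save are standard Pillow calls; the output path is a placeholder):

im_out = Image.fromarray(in_data.T)  # back to the image library's row-by-row convention
im_out.save("output.png")            # placeholder path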
Which also works for 3D images. But I'm not promising this is the case for ALL image libraries - just the ones I've worked with.
If your image is greyscale, a plain transpose is all you need, but if you are working with RGB images you want to make sure your transpose operation is along only two axes, because the third axis holds the colour channels. A sketch of both cases follows:
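A minimal sketch of both cases, using the Pillow/numpy setup from the question (the file name is a placeholder):

from PIL import Image
import numpy as np

in_data = np.asarray(Image.open("input.png"), dtype=np.uint8)  # placeholder path

if in_data.ndim == 2:
    # greyscale: shape (height, width); a plain transpose is fine
    in_data = in_data.T
else:
    # RGB(A): shape (height, width, channels); swap only the first two axes
    # and leave the colour channel axis where it is
    in_data = np.transpose(in_data, (1, 0, 2))

print(in_data.shape)  # now (width, height) or (width, height, channels)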
im may be column-major while arrays in numpy are row-major. Do
in_data = in_data.T
to transpose the python array. You should probably check in_data with matplotlib's imshow to make sure the picture looks right. But did you know that matplotlib comes with its own loading function that gives you numpy arrays directly? See: http://matplotlib.org/users/image_tutorial.html
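For example, a quick sketch of loading straight into a numpy array with matplotlib and eyeballing it with imshow (the file name is a placeholder; note that matplotlib may give you floats in [0, 1] for PNG files):

import matplotlib.pyplot as plt
import matplotlib.image as mpimg

img = mpimg.imread("input.png")  # placeholder path; returns a numpy array directly
print(img.shape)                 # (rows, cols) or (rows, cols, channels)

plt.imshow(img)  # make sure the picture looks right
plt.show()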