I want to normalize image data for deep learning.
I have a set of training images.
I can build a NumPy array from them, but if I divide each image by 255 (im / 255)
inside the for loop while building that array, it raises a MemoryError.
How can I normalize the array successfully?
---Added---
code:
import numpy as np
import cv2
import os

x_data = []
for name in sorted(os.listdir('SegmentationJPEG')):
    im = cv2.imread(os.path.join('SegmentationJPEG', name))
    im = cv2.resize(im, (572, 572))
    im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)
    x_data.append(im / 255)   # this line raises the MemoryError
x_data = np.asarray(x_data)
result:
MemoryError Traceback (most recent call last)
<ipython-input-16-2b9215277357> in <module>()
4 im = cv2.resize(im, (572, 572))
5 im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)
----> 6 x_data.append(im/255)
7 x_data = np.asarray(x_data)
MemoryError:
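One direction I am considering (not yet tried): im / 255 produces a float64 copy of every image, and np.asarray then makes yet another copy of the whole list, so the peak memory is several times the size of the raw data. A minimal sketch of preallocating a single float32 array and normalizing it in place, assuming all images resize to 572x572x3 and the full float32 array fits in RAM:

import os
import numpy as np
import cv2

img_dir = 'SegmentationJPEG'
names = sorted(os.listdir(img_dir))

# Preallocate one float32 array instead of appending float64 copies to a list.
# float32 needs half the memory of float64, and filling the array in place
# avoids the extra copy that np.asarray(list) would make at the end.
x_data = np.empty((len(names), 572, 572, 3), dtype=np.float32)

for i, name in enumerate(names):
    im = cv2.imread(os.path.join(img_dir, name))
    im = cv2.resize(im, (572, 572))
    im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)
    x_data[i] = im          # uint8 is cast to float32 on assignment

x_data /= 255.0             # normalize in place, no extra copy

If even the float32 array is too large, another option would be to keep the array as uint8 and only divide each batch by 255 right before feeding it to the network. Is one of these the recommended approach, or is there a better way?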