Keras Convolution2D Input: Error when checking model input

Posted 2019-04-07 04:26

I am working through this great tutorial on creating an image classifier using Keras. Once I have trained the model, I save the weights to a file and later reload them into the model in the test script shown below.

I get the following exception when I evaluate the model using a new, never-before-seen image:

Error:

    Traceback (most recent call last):
      File "test_classifier.py", line 48, in <module>
        score = model.evaluate(x, y, batch_size=16)
      File "/Library/Python/2.7/site-packages/keras/models.py", line 655, in evaluate
        sample_weight=sample_weight)
      File "/Library/Python/2.7/site-packages/keras/engine/training.py", line 1131, in evaluate
        batch_size=batch_size)
      File "/Library/Python/2.7/site-packages/keras/engine/training.py", line 959, in _standardize_user_data
        exception_prefix='model input')
      File "/Library/Python/2.7/site-packages/keras/engine/training.py", line 108, in standardize_input_data
        str(array.shape))
    Exception: Error when checking model input: expected convolution2d_input_1 to have shape (None, 3, 150, 150) but got array with shape (1, 3, 150, 198)

Is the problem with the model that I have trained or with how I am invoking the evaluate method?

Code:

    from keras.models import Sequential
    from keras.layers import Convolution2D, MaxPooling2D
    from keras.layers import Activation, Dropout, Flatten, Dense
    from keras.preprocessing.image import ImageDataGenerator, array_to_img, img_to_array, load_img

    import numpy as np
    img_width, img_height = 150, 150
    train_data_dir = 'data/train'
    validation_data_dir = 'data/validation'
    nb_train_samples = 2000
    nb_validation_samples = 800
    nb_epoch = 5
    model = Sequential()
    model.add(Convolution2D(32, 3, 3, input_shape=(3, img_width, img_height)))
    model.add(Activation('relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Convolution2D(32, 3, 3))
    model.add(Activation('relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Convolution2D(64, 3, 3))
    model.add(Activation('relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Flatten())
    model.add(Dense(64))
    model.add(Activation('relu'))
    model.add(Dropout(0.5))
    model.add(Dense(1))
    model.add(Activation('sigmoid'))
    model.compile(loss='binary_crossentropy',
          optimizer='rmsprop',
          metrics=['accuracy'])
    model.load_weights('first_try.h5')
    img = load_img('data/test2/ferrari.jpeg')
    x = img_to_array(img)  # this is a Numpy array with shape (3, 150, 150)
    x = x.reshape( (1,) + x.shape )  # this is a Numpy array with shape (1, 3, 150, 150)
    y = np.array([0])
    score = model.evaluate(x, y, batch_size=16)

3 Answers
爷的心禁止访问 · 2019-04-07 04:53

I had the same problem and use this function: all images in the target folder (.jpg and .png) are resized to the given height and width, scaled by 1/255, and given an extra leading dimension (the batch dimension the model input expects).

from scipy import misc
import os

def readImagesAsNumpyArrays(targetPath, i_height, i_width):
    files = os.listdir(targetPath)
    npList = list()
    for file in files:
        # only pick up .jpg and .png files (the original test
        # `".jpg" or ".png" in str(file)` was always True)
        if file.endswith(".jpg") or file.endswith(".png"):
            path = os.path.join(targetPath, file)
            img = misc.imread(path)
            # resize to the height/width the model was trained on
            img = misc.imresize(img, (i_height, i_width))
            # rescale pixel values to [0, 1], as the training generator did
            img = img * (1. / 255)
            # add a leading batch dimension: (1, height, width, channels)
            img = img[None, :, :, :]
            npList.append(img)
    return npList
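
A possible way to call it (just a sketch: the data/test2 folder and the dummy labels are illustrative, and the transpose is only needed because the question's model expects channels-first (3, 150, 150) input):

import numpy as np

# hypothetical test folder; use whatever directory holds your test images
batch_list = readImagesAsNumpyArrays('data/test2', 150, 150)
x = np.vstack(batch_list)           # shape (n_images, 150, 150, 3)

# the question's model was built with Theano-style (channels, height, width)
# input, so move the channel axis to the front
x = np.transpose(x, (0, 3, 1, 2))   # shape (n_images, 3, 150, 150)

y = np.zeros(len(batch_list))       # dummy labels, one per image
score = model.evaluate(x, y, batch_size=16)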
chillily · 2019-04-07 04:57

The issue was two-fold:

  1. The test image was the wrong size. It was 150 x 198, and needed to be 150 x 150.

  2. I had to change the dense layer from model.add(Dense(10)) to model.add(Dense(1)).

I don't yet understand how to get the model to give me a prediction, but at least now the model evaluation runs.
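
In case it helps, a minimal sketch of both fixes together (resizing at load time via target_size and calling predict; the file path is the one from the question, and the channels-first shapes assume the Theano dim ordering the model was built with):

from keras.preprocessing.image import load_img, img_to_array

# load_img can resize on the fly, so the test image comes out 150 x 150
img = load_img('data/test2/ferrari.jpeg', target_size=(150, 150))
x = img_to_array(img)           # (3, 150, 150) with Theano dim ordering
x = x.reshape((1,) + x.shape)   # add the batch dimension -> (1, 3, 150, 150)

# the final sigmoid layer outputs a single probability per image
prob = model.predict(x)
print(prob)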

放我归山 · 2019-04-07 05:16

The problem is due to the wrong size of the test images. For me,

train_datagen.flow_from_directory(
        'C:\\Users\\...\\train',  # this is the target directory
        target_size=(150, 150),  # all images will be resized to 150x150
        batch_size=32,
        class_mode='binary')

was not working properly, so I used a MATLAB command to resize all the test images, and then it worked fine.
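
If you would rather stay in Python, here is a small sketch with Pillow that does the same batch resize (the source and destination folder names are placeholders for wherever your test images live):

import os
from PIL import Image

src_dir = 'data/test2'          # original test images (placeholder path)
dst_dir = 'data/test2_resized'  # resized copies go here (placeholder path)
if not os.path.isdir(dst_dir):
    os.makedirs(dst_dir)

for name in os.listdir(src_dir):
    if name.lower().endswith(('.jpg', '.jpeg', '.png')):
        img = Image.open(os.path.join(src_dir, name))
        img = img.resize((150, 150))   # (width, height) expected by the model
        img.save(os.path.join(dst_dir, name))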
