Obtaining a prediction in Keras

Posted 2019-02-06 20:46

Question:

I have successfully trained a simple model in Keras to classify images:

from keras.models import Sequential
from keras.layers import Convolution2D, MaxPooling2D, Dropout, Flatten, Dense
# (img_channels, img_rows, img_cols, nb_classes and sgd are defined elsewhere)

model = Sequential()

model.add(Convolution2D(32, 3, 3, border_mode='valid', input_shape=(img_channels, img_rows, img_cols),
                        activation='relu', name='conv1_1'))
model.add(Convolution2D(32, 3, 3, activation='relu', name='conv1_2'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))

model.add(Convolution2D(64, 3, 3, border_mode='valid', activation='relu', name='conv2_1'))
model.add(Convolution2D(64, 3, 3, activation='relu', name='conv2_2'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))

model.add(Flatten())
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.5))

model.add(Dense(nb_classes, activation='softmax'))

model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

I can also predict the image classes using

y_pred = model.predict_classes(img, 1, verbose=0)

However, the output of y_pred is always binary. This also seems to be the case when using predict_proba and predict. My outputs look like this:

[[ 1.  0.  0.  0.]]
[[ 0.  1.  0.  0.]]

This works OK, but I'd like to have a probability for each class instead, for example:

[[ 0.8  0.1  0.1  0.4]]

How do I get this in Keras?

Answer 1:

Softmax might yield "one-hot" like output. Consider the following example:

Input      Exponent   Softmax value
   20     485165195         0.99994
    9          8103         0.00002
    5           148         0.00000
   10         22026         0.00005
------------------------------------
Sum       485195473         1

Since the exponential function grows very fast, softmax starts yielding one-hot-like output as soon as the inputs reach an order of magnitude of about ten, as in the example above. The Keras implementation of softmax subtracts the maximum value from the inputs for numerical stability, but in the case above that doesn't change the result.
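For illustration, here is a minimal NumPy sketch that reproduces the table above and confirms that the max-subtraction step leaves the probabilities unchanged:

import numpy as np

logits = np.array([20., 9., 5., 10.])

# Plain softmax: exponentiate, then normalise by the sum.
plain = np.exp(logits) / np.exp(logits).sum()

# Numerically stable variant: subtract the maximum first, as Keras does.
shifted = logits - logits.max()
stable = np.exp(shifted) / np.exp(shifted).sum()

print(plain)                       # roughly [0.99994  0.00002  0.00000  0.00005]
print(np.allclose(plain, stable))  # True -- the shift does not change the output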

Possible ways to fix this (both are illustrated in the sketch below):

  1. Make sure that the input images are rescaled so that pixel values are between 0 and 1.

  2. Add some regularizers to your model.
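
To make these concrete, here is a minimal sketch assuming the Keras 1.x API used in the question, with a hypothetical uint8 image array X_train:

from keras.regularizers import l2

# 1. Rescale pixel values from [0, 255] to [0, 1] before training and prediction.
X_train = X_train.astype('float32') / 255.0

# 2. Add L2 weight regularization to the dense layer
#    (in Keras 2 the argument is called kernel_regularizer).
model.add(Dense(512, activation='relu', W_regularizer=l2(0.01)))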