Keras model doesn't seem to work

Posted 2019-08-20 08:05

Question:

I have the following Keras model, and when I train it, it doesn't seem to learn anything. I asked around and got different suggestions, such as the weights not being initialised properly or back-propagation not happening. The model is:

from keras.models import Sequential
from keras.layers import Conv2D, MaxPool2D, Flatten, Dense

model = Sequential()

model.add(Conv2D(32, (3, 3), kernel_initializer='random_uniform', activation='relu', input_shape=(x1, x2, depth)))
model.add(MaxPool2D(pool_size=(2, 2)))

model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPool2D(pool_size=(2, 2)))

model.add(Flatten())

model.add(Dense(128, activation='relu'))

model.add(Dense(3, activation='softmax'))

I even looked at this solution, but I don't think I've made that mistake, since I do have a softmax at the end. For reference, here is the output of the training process:

Epoch 1/10
283/283 [==============================] - 1s 2ms/step - loss: 5.1041 - acc: 0.6254 - val_loss: 9.0664 - val_acc: 0.4375
Epoch 2/10
283/283 [==============================] - 0s 696us/step - loss: 4.9550 - acc: 0.6926 - val_loss: 9.0664 - val_acc: 0.4375
Epoch 3/10
283/283 [==============================] - 0s 717us/step - loss: 4.9550 - acc: 0.6926 - val_loss: 9.0664 - val_acc: 0.4375
Epoch 4/10
283/283 [==============================] - 0s 692us/step - loss: 4.9550 - acc: 0.6926 - val_loss: 9.0664 - val_acc: 0.4375
Epoch 5/10
283/283 [==============================] - 0s 701us/step - loss: 4.9550 - acc: 0.6926 - val_loss: 9.0664 - val_acc: 0.4375
Epoch 6/10
283/283 [==============================] - 0s 711us/step - loss: 4.9550 - acc: 0.6926 - val_loss: 9.0664 - val_acc: 0.4375
Epoch 7/10
283/283 [==============================] - 0s 707us/step - loss: 4.9550 - acc: 0.6926 - val_loss: 9.0664 - val_acc: 0.4375
Epoch 8/10
283/283 [==============================] - 0s 708us/step - loss: 4.9550 - acc: 0.6926 - val_loss: 9.0664 - val_acc: 0.4375
Epoch 9/10
283/283 [==============================] - 0s 703us/step - loss: 4.9550 - acc: 0.6926 - val_loss: 9.0664 - val_acc: 0.4375
Epoch 10/10
283/283 [==============================] - 0s 716us/step - loss: 4.9550 - acc: 0.6926 - val_loss: 9.0664 - val_acc

This is how I'm compiling it:

from keras import optimizers

sgd = optimizers.SGD(lr=0.001, decay=1e-4, momentum=0.05, nesterov=True)

model.compile(loss='categorical_crossentropy',
              optimizer=sgd,
              metrics=['accuracy'])
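
The question doesn't show the actual training call, but judging from the log above (283 training samples, 10 epochs, a separate validation set), it was presumably something along these lines; x_train, y_train, x_val, y_val and the batch size are placeholders, not taken from the original code:

# Hypothetical training call reconstructed from the log above.
# x_train, y_train, x_val, y_val are placeholders for the real data;
# batch_size=32 is an assumption.
model.fit(x_train, y_train,
          epochs=10,
          batch_size=32,
          validation_data=(x_val, y_val))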

Any suggestions? I have initialised the weights explicitly, and Keras should be taking care of back-propagation, so what am I missing?

Answer 1:

I found the solution: I had to normalise/scale the input images before training. The model now trains properly. Here's the link that helped me with it.
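
For anyone hitting the same problem, here is a minimal sketch of the kind of scaling that fixes it, assuming 8-bit images stored as NumPy arrays; the array names and shapes are placeholders, not taken from the original code:

import numpy as np

# Placeholders standing in for the real image arrays, assumed to hold
# 8-bit pixel values in [0, 255]; the shapes are made up for the example.
x_train = np.random.randint(0, 256, size=(283, 32, 32, 3)).astype('float32')
x_val = np.random.randint(0, 256, size=(16, 32, 32, 3)).astype('float32')

# Scale pixel values to [0, 1] before calling model.fit so the network
# receives inputs in a small, consistent range.
x_train /= 255.0
x_val /= 255.0

With unscaled pixel values in the hundreds, the activations can saturate or blow up early in training, which is consistent with the loss getting stuck after the first epoch in the log above.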