Keras model with high accuracy but poor prediction

Posted 2019-08-25 00:13

Question:

I am trying to train a neural network to perform inverse kinematics calculations for a robotic arm with predefined segment lengths. I am not including the segment lengths in the network's inputs; instead, they are implicit in the training data. The training data is a pandas DataFrame of the arm's spatial mappings, with the labels being the rotation angles of the arm's three segments and the features being the x and y coordinates where the endpoint of the last segment ends up.
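For reference, training pairs like these can be generated from the arm's forward kinematics. A minimal sketch, assuming three unit-length segments and joint angles that accumulate along the chain (the asker's actual geometry and segment lengths may differ):

```python
import numpy as np

def forward_kinematics(theta, lengths=(1.0, 1.0, 1.0)):
    """Endpoint (x, y) of a planar 3-segment arm.

    theta: array of shape (n, 3), joint angles in radians.
    Each segment's absolute orientation is the cumulative sum of
    the joint angles (a common planar-arm convention; assumed here).
    """
    theta = np.asarray(theta, dtype=float)
    abs_angles = np.cumsum(theta, axis=1)  # absolute segment orientations
    x = np.sum(np.asarray(lengths) * np.cos(abs_angles), axis=1)
    y = np.sum(np.asarray(lengths) * np.sin(abs_angles), axis=1)
    return np.stack([x, y], axis=1)

# Labels: joint angles; features: the resulting endpoint coordinates.
rng = np.random.default_rng(0)
labels = np.deg2rad(rng.integers(0, 360, size=(1000, 3)))
samples = forward_kinematics(labels)
```

With all angles at zero the arm lies straight along the x-axis, so the endpoint is at (3, 0) for unit segments.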

I am using Keras with Theano as the Backend.

model = Sequential([
    Dense(3, input_shape=(2,), activation="relu"),
    Dense(3, activation="relu"),
    Dense(3)
])

model.summary()

model.compile(Adam(lr=0.001), loss='mean_squared_error', metrics=['accuracy'])
model.fit(samples, labels, validation_split=0.2, batch_size=1000, epochs=10, shuffle=True, verbose=1)

score = model.evaluate(samples, labels, batch_size=32, verbose=1)

print('Test score:', score[0])
print('Test accuracy:', score[1])

weights = model.get_weights()
predictions = model.predict(samples, verbose=1)
print(predictions)
model.save("IK_NN_7-4-3_keras.h5")

OUTPUT===============================================================


Train on 6272736 samples, validate on 1568184 samples
Epoch 1/10
 - 5s - loss: 10198.7558 - acc: 0.9409 - val_loss: 12149.1703 - val_acc: 0.9858
Epoch 2/10
 - 5s - loss: 4272.9105 - acc: 0.9932 - val_loss: 12117.0527 - val_acc: 0.9858
Epoch 3/10
 - 5s - loss: 4272.7862 - acc: 0.9932 - val_loss: 12113.3804 - val_acc: 0.9858
Epoch 4/10
 - 5s - loss: 4272.7567 - acc: 0.9932 - val_loss: 12050.8211 - val_acc: 0.9858
Epoch 5/10
 - 5s - loss: 4272.7271 - acc: 0.9932 - val_loss: 12036.5538 - val_acc: 0.9858
Epoch 6/10
 - 5s - loss: 4272.7350 - acc: 0.9932 - val_loss: 12103.8665 - val_acc: 0.9858
Epoch 7/10
 - 5s - loss: 4272.7553 - acc: 0.9932 - val_loss: 12175.0442 - val_acc: 0.9858
Epoch 8/10
 - 5s - loss: 4272.7282 - acc: 0.9932 - val_loss: 12161.4815 - val_acc: 0.9858
Epoch 9/10
 - 5s - loss: 4272.7213 - acc: 0.9932 - val_loss: 12101.4021 - val_acc: 0.9858
Epoch 10/10
 - 5s - loss: 4272.7909 - acc: 0.9932 - val_loss: 12152.4966 - val_acc: 0.9858
Test score: 5848.549130022683
Test accuracy: 0.9917127071823204
[[ 59.452095 159.26912  258.94424 ]
 [ 59.382706 159.41936  259.25183 ]
 [ 59.72419  159.69777  259.48584 ]
 ...
 [ 59.58721  159.33467  258.9603  ]
 [ 59.51745  159.69331  259.62595 ]
 [ 59.984367 160.5533   260.7689  ]]

Both the test accuracy and the validation accuracy seem good, but they don't reflect reality at all. The predictions should have looked something like this:

[[  0   0   0]
[  0   0   1]
[  0   0   2]
...
[358 358 359]
[358 359 359]
[359 359 359]]

since I fed back the same features, expecting to get the same labels back. Instead, I'm getting these numbers for some reason:

[[ 59.452095 159.26912  258.94424 ]
 [ 59.382706 159.41936  259.25183 ]
 [ 59.72419  159.69777  259.48584 ]
 ...
 [ 59.58721  159.33467  258.9603  ]
 [ 59.51745  159.69331  259.62595 ]
 [ 59.984367 160.5533   260.7689  ]]

Thank you for your time.

Answer 1:

First of all, your metric is accuracy, but you are predicting continuous values. You get predictions, but they don't make any sense: your problem is regression, while your metric is for classification. Use MSE, R², or other regression metrics instead:

from keras import metrics
model.compile(loss='mse', optimizer='adam', metrics=[metrics.mean_squared_error, metrics.mean_absolute_error])

Additionally, you should consider increasing the number of neurons, and if your input data really is only 2-dimensional, think about shallow models rather than ANNs (e.g., an SVM with a Gaussian kernel).
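As a rough illustration of the shallow-model suggestion, here is a sketch using scikit-learn's SVR with an RBF (Gaussian) kernel, wrapped in MultiOutputRegressor since SVR predicts a single target and the arm has three angles. The data here is a random placeholder, not the asker's dataset:

```python
import numpy as np
from sklearn.multioutput import MultiOutputRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 2))          # stand-in for (x, y) endpoint features
y = rng.uniform(0, 2 * np.pi, size=(500, 3))   # stand-in for the three joint angles

# One RBF-kernel SVR per target angle; C and gamma would need tuning on real data.
model = MultiOutputRegressor(SVR(kernel="rbf", C=10.0, gamma="scale"))
model.fit(X, y)

preds = model.predict(X)
print(preds.shape)  # one 3-angle prediction per sample
```

Unlike the Keras setup above, evaluating this with a regression score (e.g. `model.score`, which reports R²) avoids the misleading "accuracy" numbers entirely.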