Keras custom metric gives incorrect tensor shape

Question:

I want to monitor the batch dimension of y_pred by defining my own custom metric (using the Theano backend):

def shape_test(y_true, y_pred):
    return K.shape(y_pred)[0]

I was assuming that the batch dimension of y_pred inside the custom metric function equals the mini-batch size. However, I get weird output. See the small reproducible example below.

#imports and definitions
import numpy
numpy.random.seed(1234)
import keras.backend as K
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

neuron_num=20
dim_input=2
#batch size will be important below!
batch_size=2048
TT=int(1e4)

#sample data
X=numpy.random.randn(TT,dim_input)
eps=numpy.random.randn(TT)
Y=0.3*X[:,0]+0.5*X[:,1]+eps
x={"is":X[:(TT/2),:],"os":X[(TT/2+1):,:]}
y={"is":Y[:(TT/2)],"os":Y[(TT/2+1):]}

This is the custom metric as given above

def shape_test(y_true, y_pred):
    return K.shape(y_pred)[0]
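
K.shape returns an integer tensor; a float-returning variant can be written explicitly if needed. A minimal sketch, assuming the backend's K.cast (it is not required to reproduce the behaviour below):

def shape_test_float(y_true, y_pred):
    #cast the integer batch dimension to float32 so it aggregates
    #like the built-in float metrics
    return K.cast(K.shape(y_pred)[0], "float32")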

Now define a simple NN

sgd=SGD(lr=1e-2,nesterov=True)

model=Sequential()
model.add(Dense(neuron_num,
                input_dim=x["is"].shape[1],
                init="glorot_normal",
                activation="tanh"))
model.add(Dense(neuron_num,init="glorot_normal",activation="tanh"))
model.add(Dense(1,init="glorot_normal",activation="linear"))
model.compile(loss="mean_squared_error",
              optimizer=sgd,
              metrics=["mean_squared_error",shape_test])

model.fit(x["is"],
          y["is"],
          validation_data=(x["os"],y["os"]),
          nb_epoch=1,
          batch_size=batch_size,
          verbose=False).history

This gives

#{'loss': [1.834826689338684],
# 'mean_squared_error': [1.834826689338684],
# 'shape_test': [1841],
# 'val_loss': [1.4931119817522769],
# 'val_mean_squared_error': [1.4931119817522769],
# 'val_shape_test': [1841.1716343268654]}

I would have expected to see 'shape_test': [2048] instead of 'shape_test': [1841], as the batch size is 2048.

This seems very weird. Is this possibly a bug? I am using Python 2.7.6, Keras==1.0.8 and Theano==0.8.2, running on the CPU.

Answer 1:

Using neuron_num=2000 and verbose=True, here is what I was able to produce with your example:

Epoch 1/1
2048/5000 [========>............] - ETA: 9s - loss: 1.4507 - shape_test: 2048.000
4096/5000 [=================>...] - ETA: 3s - loss: 1.3577 - shape_test: 2048.000
5000/5000 [=====================] - 26s - loss: 1.3087 - shape_test: 1841.1648 - val_shape_test: 1841.1716

As you can see, your shape function works fine. But since batch_size is not a divisor of the training set size, the last batch only contains 904 examples (5000 = 2048 + 2048 + 904). The 1841 is not a bug either: Keras reports running metrics as the average of the per-batch values, weighted by the number of samples in each batch.
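
Since each batch reports its own size as the metric value, the weighted averages can be checked by hand (using the 5000/4999 split sizes from the question):

#training set: 5000 samples -> batches of 2048, 2048 and 904
print((2048*2048 + 2048*2048 + 904*904)/5000.0) #1841.1648
#validation set: 4999 samples -> batches of 2048, 2048 and 903
print((2048*2048 + 2048*2048 + 903*903)/4999.0) #1841.1716...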

Another try with batch_size=2500, which divides the 5000 training samples evenly, looks cleaner:

2500/5000 [==========>..........] - ETA: 9s - loss: 1.4292 - shape_test: 2500.0000
5000/5000 [=====================] - 24s - loss: 1.3311 - shape_test: 2500.0000 - val_shape_test: 2499.5001
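
The leftover 2499.5001 in val_shape_test has the same explanation: the 4999 validation samples split into batches of 2500 and 2499, whose size-weighted average is:

print((2500*2500 + 2499*2499)/4999.0) #2499.5001...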