Mismatch in expected Keras shapes after pooling

Posted 2019-07-23 16:05

I'm building a few simple models in Keras to improve my knowledge of deep learning, and encountering some issues I don't quite understand how to debug.

I want to use a 1D CNN to perform regression on some time-series data. My input feature tensor is of shape N x T x D, where N is the number of data points, T is the sequence length (number of timesteps), and D is the number of feature dimensions. My target tensor is of shape N x T x 1 (1 because I am trying to output a scalar value).

I've set up my model architecture like this:

feature_tensor.shape
# (75584, 40, 38)
target_tensor.shape
# (75584, 40, 1)

from keras.models import Model
from keras.layers import Input, Conv1D, MaxPooling1D, Flatten, Dense
from keras.optimizers import Adam

SEQUENCE_LENGTH = 40  # timesteps per sample
DIMENSIONS = 38       # features per timestep

inputs = Input(shape=(SEQUENCE_LENGTH, DIMENSIONS))
conv1 = Conv1D(filters=64, kernel_size=3, activation='relu')
x = conv1(inputs)
x = MaxPooling1D(pool_size=2)(x)
x = Flatten()(x)
x = Dense(100, activation='relu')(x)
predictions = Dense(1, activation="linear")(x)

model = Model(inputs, predictions)
opt = Adam(lr=1e-5, decay=1e-4 / 200)
model.compile(loss="mean_absolute_error", optimizer=opt)

When I attempt to train my model, however, I get the following output:

r = model.fit(cleaned_tensor, target_tensor, epochs=100, batch_size=2058)

ValueError: Error when checking target: expected dense_164 to have 2 dimensions, but got array with shape (75584, 40, 1).

The first two numbers are familiar: 75584 is the # of samples, 40 is the sequence length.

When I inspect my model summary, I see that the output of the Flatten layer should be of size 1216:

[model.summary() screenshot: the Flatten layer's output shape is (None, 1216)]
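For reference, the layer-by-layer output shapes implied by this architecture work out as follows (a manual walk-through, assuming Conv1D's default 'valid' padding):

# Input:            (None, 40, 38)
# Conv1D(64, 3):    (None, 38, 64)   # 40 - 3 + 1 = 38 timesteps
# MaxPooling1D(2):  (None, 19, 64)   # 38 // 2 = 19
# Flatten():        (None, 1216)     # 19 * 64 = 1216
# Dense(100):       (None, 100)
# Dense(1):         (None, 1)        # a 2-D output, while the target is 3-D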

However, my colleague and I stared at the code for a long time and could not understand how a shape of (75584, 40, 1) ends up at the dense layer, given this architecture.

Could someone point me in the direction of what I am doing wrong?

1 Answer
#2 · Answered 2019-07-23 16:34

Try reshaping your target variable to N x T, and it looks like your final Dense layer should have 40 units rather than 1 (I think).
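A minimal sketch of that suggestion, reusing the variable names and hyperparameters from the question (squeezing the target and widening the last Dense layer are the only changes):

import numpy as np
from keras.models import Model
from keras.layers import Input, Conv1D, MaxPooling1D, Flatten, Dense
from keras.optimizers import Adam

# Drop the trailing singleton axis: (75584, 40, 1) -> (75584, 40)
target_2d = np.squeeze(target_tensor, axis=-1)

inputs = Input(shape=(SEQUENCE_LENGTH, DIMENSIONS))
x = Conv1D(filters=64, kernel_size=3, activation='relu')(inputs)
x = MaxPooling1D(pool_size=2)(x)
x = Flatten()(x)
x = Dense(100, activation='relu')(x)
predictions = Dense(SEQUENCE_LENGTH, activation='linear')(x)  # 40 outputs, one per timestep

model = Model(inputs, predictions)
model.compile(loss="mean_absolute_error", optimizer=Adam(lr=1e-5, decay=1e-4 / 200))

# Model output is now (None, 40), matching the (75584, 40) target.
# Use whichever array holds your (75584, 40, 38) features here (feature_tensor or cleaned_tensor).
r = model.fit(feature_tensor, target_2d, epochs=100, batch_size=2058)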
