Keras fit with y=None with an Embedding layer

Question:

I built the following Keras Sequential model:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Flatten
from tensorflow.keras.optimizers import Adam

def build_model(vocab_size, neurons, max_length, lr):  # function wrapper assumed from "return model"
    model = Sequential()
    # Embed each sequence of max_length word indices into `neurons` dimensions, then flatten
    model.add(Embedding(vocab_size, neurons, input_length=max_length))
    model.add(Flatten())
    model.compile(optimizer=Adam(lr=lr), loss='binary_crossentropy', metrics=['acc'])
    print(model.summary())
    return model
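
For concreteness, parameter values consistent with the summary below would be something like this (the learning rate here is just an example value):

# Values consistent with the model summary: 27,000 * 300 = 8,100,000 embedding parameters
vocab_size = 27000   # vocabulary size used for one_hot hashing
neurons = 300        # embedding dimension
max_length = 44      # padded sequence length
lr = 1e-3            # example learning rate only

model = build_model(vocab_size, neurons, max_length, lr)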

which gives this summary:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding (Embedding)        (None, 44, 300)           8100000   
_________________________________________________________________
flatten (Flatten)            (None, 13200)             0         
=================================================================
Total params: 8,100,000
Trainable params: 8,100,000
Non-trainable params: 0

I then fit it like this:

from tensorflow import keras

# Hash each line into integer word indices, then pad to the model's input length
encoded_lines = [keras.preprocessing.text.one_hot(d, vocab_size) for d in text_lines]
padded_lines = keras.preprocessing.sequence.pad_sequences(encoded_lines, maxlen=max_length, padding='post')
model.fit(padded_lines, epochs=10)  # note: no target data (y) is passed

Here text_lines is a list of lines of text. I pass the padded, one_hot-encoded sequences (using https://www.tensorflow.org/api_docs/python/tf/keras/preprocessing/text/one_hot) as training data, and nothing at all as target data. This trains happily, so my question is: what does Keras use as target data? Against what target values is the accuracy computed? My only guess is that the target data is simply the same as the training data.
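
For comparison, I understand the usual call with explicit targets to look something like the following; the dummy_targets array is purely hypothetical, shaped to match the model's (None, 13200) output:

import numpy as np

# Purely hypothetical targets matching the Flatten output shape, just to show the usual x/y call
dummy_targets = np.zeros((len(padded_lines), 13200))
model.fit(padded_lines, dummy_targets, epochs=10)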

From the documentation of model.fit (https://keras.io/models/model/):

y: Numpy array of target (label) data (if the model has a single output), or list of Numpy arrays (if the model has multiple outputs). If output layers in the model are named, you can also pass a dictionary mapping output names to Numpy arrays. y can be None (default) if feeding from framework-native tensors (e.g. TensorFlow data tensors).

but this doesn't help me much.
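
If I read that correctly, the "framework-native tensors" case would be something like passing a tf.data.Dataset that already yields (input, target) pairs, so that no separate y is needed; a sketch with hypothetical targets:

import numpy as np
import tensorflow as tf

# Sketch: a dataset that already yields (x, y) pairs, so fit() takes no separate y argument.
# The zero targets are hypothetical and only there to make the example self-contained.
dummy_targets = np.zeros((len(padded_lines), 13200))
dataset = tf.data.Dataset.from_tensor_slices((padded_lines, dummy_targets)).batch(32)
model.fit(dataset, epochs=10)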