How is the training accuracy in Keras determined for every epoch?


Question:

I am training a model in Keras as follows:

model.fit(Xtrn, ytrn, batch_size=16, epochs=50, verbose=1, shuffle=True,
          callbacks=[model_checkpoint], validation_data=(Xval, yval))

The fitting output looks as follows:

As shown in the model.fit call, I have a batch size of 16, and the output shows a total of 8000 training samples. So, from my understanding, a training step takes place for every batch of 16 samples, which means the training step is run 500 times in a single epoch (i.e., 8000/16 = 500).
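As a quick sanity check, the step count can be spelled out directly (a minimal sketch; the numbers below are simply the ones quoted above):

import math

num_samples = 8000   # total training samples shown in the fit output
batch_size = 16      # batch_size passed to model.fit
steps_per_epoch = math.ceil(num_samples / batch_size)
print(steps_per_epoch)  # -> 500 weight updates (batches) per epoch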

So let's take the training accuracy printed in the output for Epoch 1/50, which in this case is 0.9381. I would like to know how this training accuracy of 0.9381 is derived.

Is it:

i) the mean training accuracy, i.e., the average over the 500 batches on which the training step is performed?

OR,

ii) the best (or maximum) training accuracy out of the 500 times the training step is run?

Answer 1:

Take a look at the BaseLogger in Keras, where a running mean is computed. For each epoch, the reported accuracy is the average over all batches seen so far in that epoch, weighted by batch size.

class BaseLogger(Callback):
    """Callback that accumulates epoch averages of metrics.

    This callback is automatically applied to every Keras model.
    """

    def on_epoch_begin(self, epoch, logs=None):
        self.seen = 0
        self.totals = {}

    def on_batch_end(self, batch, logs=None):
        logs = logs or {}
        batch_size = logs.get('size', 0)
        self.seen += batch_size

        for k, v in logs.items():
            if k in self.totals:
                self.totals[k] += v * batch_size
            else:
                self.totals[k] = v * batch_size

    def on_epoch_end(self, epoch, logs=None):
        if logs is not None:
            for k in self.params['metrics']:
                if k in self.totals:
                    # Make value available to next callbacks.
                    logs[k] = self.totals[k] / self.seen
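To make the accumulation concrete, here is a small, self-contained sketch that applies the same size-weighted averaging as BaseLogger to some made-up per-batch accuracies (the values are invented purely for illustration):

# Hypothetical per-batch logs for one epoch; only 'size' and 'acc' matter here.
batch_logs = [
    {'size': 16, 'acc': 0.90},
    {'size': 16, 'acc': 0.94},
    {'size': 16, 'acc': 0.95},
    {'size': 8,  'acc': 1.00},   # a smaller final batch contributes less weight
]

totals = {}
seen = 0
for logs in batch_logs:                          # mirrors on_batch_end
    batch_size = logs.get('size', 0)
    seen += batch_size
    for k, v in logs.items():
        if k == 'size':
            continue
        totals[k] = totals.get(k, 0.0) + v * batch_size

epoch_logs = {k: v / seen for k, v in totals.items()}   # mirrors on_epoch_end
print(round(epoch_logs['acc'], 4))   # 0.94 -- the weighted mean, not the per-batch max of 1.00

So the 0.9381 printed for Epoch 1/50 is option (i): the size-weighted average of the per-batch accuracies, not the best batch.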