Epoch vs Iteration when training neural networks

Posted 2020-01-25 12:22

What is the difference between epoch and iteration when training a multi-layer perceptron?

12 Answers
家丑人穷心不美
#2 · 2020-01-25 12:39

1. An epoch is 1 complete cycle in which the neural network has seen all the data.
2. One might have, say, 100,000 images to train the model, but memory may not be sufficient to process all the images at once, so we split training into smaller chunks of data called batches, e.g. a batch size of 100.
3. We need to cover all the images using multiple batches, so we need 1,000 iterations to cover all 100,000 images (100 batch size * 1,000 iterations).
4. Once the neural network has looked at the entire data set, that is 1 epoch (point 1). One might need multiple epochs to train the model (let us say 10 epochs); see the sketch after this list.
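As a sketch of how these numbers fit together (plain Python; the names `num_samples`, `batch_size`, and `num_epochs` are illustrative, not tied to any framework):

```python
num_samples = 100_000          # point 2: total training images
batch_size = 100               # point 2: images processed per step
num_epochs = 10                # point 4: full passes over the data

iterations_per_epoch = num_samples // batch_size   # point 3: 1000
for epoch in range(num_epochs):
    for i in range(iterations_per_epoch):
        start = i * batch_size
        batch = range(start, start + batch_size)  # indices of one chunk
        # forward pass + backward pass + weight update on `batch`
        # would happen here; one such step is one iteration

print(iterations_per_epoch * num_epochs)  # 10000 iterations in total
```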

#3 · 2020-01-25 12:43

I believe an iteration is equivalent to a single batch forward pass + backprop in mini-batch SGD. An epoch is one pass through the entire dataset (as someone else mentioned).

淡お忘
#4 · 2020-01-25 12:43

Epoch and iteration describe different things.


Epoch

An epoch is one complete pass of the algorithm over the entire data set; the number of epochs is how many such passes you run. So, each time the algorithm has seen all samples in the dataset, an epoch has been completed.

Iteration

An iteration is a single batch of data passing through the algorithm. In the case of neural networks, that means one forward pass and one backward pass. So, every time you pass a batch of data through the NN, you complete an iteration.


Example

An example might make it clearer.

Say you have a dataset of 10 examples (or samples). You have a batch size of 2, and you've specified you want the algorithm to run for 3 epochs.

Therefore, in each epoch, you have 5 batches (10/2 = 5). Each batch gets passed through the algorithm, therefore you have 5 iterations per epoch. Since you've specified 3 epochs, you have a total of 15 iterations (5*3 = 15) for training.
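The arithmetic can be checked with a few lines of Python; the loop below only counts iterations and stands in for the real forward and backward passes:

```python
dataset_size = 10   # examples
batch_size = 2
epochs = 3

iterations = 0
for epoch in range(epochs):
    for batch_start in range(0, dataset_size, batch_size):  # 5 batches
        iterations += 1     # one batch through the network = one iteration

print(iterations)  # 15  (5 batches per epoch * 3 epochs)
```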

We Are One
#5 · 2020-01-25 12:49

epoch

A full training pass over the entire dataset such that each example has been seen once. Thus, an epoch represents N/batch size training iterations, where N is the total number of examples.

iteration

A single update of a model's weights during training. An iteration consists of computing the gradients of the parameters with respect to the loss on a single batch of data.

as bonus:

batch

The set of examples used in one iteration (that is, one gradient update) of model training.

See also batch size.

source: https://developers.google.com/machine-learning/glossary/
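To make those glossary entries concrete, here is a minimal NumPy sketch of "a single update of a model's weights" per iteration, using a toy linear model and mean-squared-error loss (the model, data, and learning rate are illustrative assumptions, not part of the glossary):

```python
import numpy as np

def sgd_iteration(w, x_batch, y_batch, lr=0.1):
    """One iteration: gradient on one batch, then one weight update."""
    pred = x_batch @ w                                   # forward pass
    grad = x_batch.T @ (pred - y_batch) / len(x_batch)   # MSE gradient
    return w - lr * grad                                 # the single update

rng = np.random.default_rng(0)
x, y = rng.normal(size=(6, 3)), rng.normal(size=6)       # N = 6 examples
w = np.zeros(3)

batch_size = 2
for start in range(0, len(x), batch_size):
    w = sgd_iteration(w, x[start:start + batch_size],
                      y[start:start + batch_size])
# The loop above is exactly one epoch: every example seen once,
# completed in N / batch_size = 3 iterations (gradient updates).
```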

祖国的老花朵
#6 · 2020-01-25 12:51

Typically, you'll split your training set into small batches for the network to learn from, and training proceeds step by step, propagating each batch through the layers and applying gradient descent all the way back down. All of these small steps can be called iterations.

An epoch corresponds to the entire training set going through the entire network once. It can be useful to limit the number of epochs, e.g. to fight overfitting; a sketch of one way to do that follows below.
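One common way to limit the number of epochs against overfitting is early stopping on a held-out validation loss. A minimal sketch, with a fake loss curve standing in for a real model's per-epoch validation loss:

```python
# Fake per-epoch validation losses; a real loop would train for one
# epoch and then evaluate on held-out data instead.
val_losses = [0.90, 0.70, 0.60, 0.55, 0.56, 0.57, 0.58, 0.40]

patience = 3                        # tolerated epochs without improvement
best_loss, bad_epochs = float("inf"), 0
for epoch, loss in enumerate(val_losses):
    if loss < best_loss:
        best_loss, bad_epochs = loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:  # stalled for 3 epochs in a row
            break                   # stop: more epochs risk overfitting

print(f"stopped after epoch {epoch}, best validation loss {best_loss}")
```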

smile是对你的礼貌
#7 · 2020-01-25 12:53

Many neural network training algorithms involve making multiple presentations of the entire data set to the neural network. Often, a single presentation of the entire data set is referred to as an "epoch". In contrast, some algorithms present data to the neural network a single case at a time.

"Iteration" is a much more general term, but since you asked about it together with "epoch", I assume that your source is referring to the presentation of a single case to a neural network.
