🔄 What is an Epoch?
An epoch is one complete pass through the entire training dataset by the model.
If you have 1,000 samples and a batch size of 100:
One epoch = 10 batches (1,000 ÷ 100 = 10).
After each epoch, the model has seen all training data once.
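Here is a minimal sketch in plain Python (using the hypothetical sample count and batch size from above) showing how one epoch breaks down into batches:

```python
num_samples = 1_000   # hypothetical dataset size from the example above
batch_size = 100
batches_per_epoch = num_samples // batch_size  # 1,000 // 100 = 10

for epoch in range(3):  # 3 epochs = 3 full passes over the data
    for batch_idx in range(batches_per_epoch):
        start = batch_idx * batch_size
        batch = range(start, start + batch_size)  # stand-in for real samples
        # a real model would run a forward/backward pass on `batch` here
    print(f"epoch {epoch + 1}: saw all {num_samples} samples once")
```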
⚙️ Why Multiple Epochs?
A single epoch usually isn’t enough for the model to learn meaningful patterns.
Training for multiple epochs allows the model to gradually adjust weights and improve accuracy.
Too few epochs → underfitting (model hasn’t learned enough).
Too many epochs → overfitting (model memorizes training data, performs poorly on unseen data).
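A common way to balance these two failure modes is early stopping: train for many epochs, but stop once validation loss stops improving. Here is a rough sketch in plain Python; `train_one_epoch` and `validation_loss` are illustrative stand-ins, not a real library API:

```python
import random

def train_one_epoch():
    """Stand-in for one full pass over the training data."""

def validation_loss(epoch):
    """Stand-in: loss falls at first, then rises as overfitting sets in."""
    return abs(epoch - 6) + random.random() * 0.1

best_loss = float("inf")
patience, bad_epochs = 3, 0   # tolerate 3 epochs without improvement

for epoch in range(50):
    train_one_epoch()
    loss = validation_loss(epoch)
    if loss < best_loss:
        best_loss, bad_epochs = loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            print(f"stopping at epoch {epoch + 1}: validation loss stopped improving")
            break
```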
📌 Epochs vs. Batches vs. Iterations
| Term | Meaning |
| --- | --- |
| Batch | Subset of the dataset processed at once (e.g., 32 samples). |
| Iteration | One weight-update step (processing a single batch). |
| Epoch | One full pass through the dataset (all batches processed once). |
So:
Epochs = how many times the model sees the full dataset.
Iterations = how many times weights are updated.
Batch size = how many samples are processed per iteration.
✅ Example
Dataset size = 10,000 samples
Batch size = 100
Epochs = 5
➡️ Each epoch = 100 iterations (10,000 ÷ 100).
➡️ Total training = 500 iterations (100 × 5).
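The same arithmetic, checked in a few lines of Python:

```python
dataset_size = 10_000
batch_size = 100
epochs = 5

iterations_per_epoch = dataset_size // batch_size  # 10,000 // 100 = 100
total_iterations = iterations_per_epoch * epochs   # 100 × 5 = 500
print(iterations_per_epoch, total_iterations)      # -> 100 500
```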
In short: Epochs are the number of times the model cycles through the entire dataset during training.