Sunday, 16 November 2025

Deep Learning 10 : What is an Epoch?

 

🔄 What is an Epoch?

  • An epoch is one complete pass through the entire training dataset by the model.

  • If you have 1,000 samples and a batch size of 100:

    • One epoch = 10 batches (because 100 × 10 = 1,000).

  • After each epoch, the model has seen all training data once.
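The idea above can be sketched in a few lines of plain Python. This is a minimal illustration (not a real training loop): the dataset is a stand-in list, and the numbers match the example above — 1,000 samples with a batch size of 100.

```python
# Minimal sketch: one epoch = iterating over the whole dataset in batches.
dataset = list(range(1000))   # stand-in for 1,000 training samples
batch_size = 100

# Split the dataset into consecutive batches.
batches = [dataset[i:i + batch_size] for i in range(0, len(dataset), batch_size)]

print(len(batches))  # → 10 batches per epoch
```

Processing all 10 batches once is exactly one epoch; a second epoch would loop over the same batches again.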

⚙️ Why Multiple Epochs?

  • A single epoch usually isn’t enough for the model to learn meaningful patterns.

  • Training for multiple epochs allows the model to gradually adjust weights and improve accuracy.

  • Too few epochs → underfitting (model hasn’t learned enough).

  • Too many epochs → overfitting (model memorizes training data, performs poorly on unseen data).
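A common guard against training for too many epochs is early stopping: watch the validation loss and stop once it stops improving. The sketch below illustrates the logic only — the per-epoch loss values are made up for demonstration, and `patience` is a hypothetical setting.

```python
# Illustrative early-stopping logic: stop when validation loss
# has not improved for `patience` consecutive epochs.
val_losses = [0.90, 0.70, 0.55, 0.50, 0.52, 0.58, 0.65]  # made-up per-epoch losses
patience = 2  # epochs to wait after the last improvement

best_loss = float("inf")
epochs_without_improvement = 0
stopped_at = None

for epoch, loss in enumerate(val_losses, start=1):
    if loss < best_loss:
        best_loss = loss
        epochs_without_improvement = 0
    else:
        epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            stopped_at = epoch
            break

print(stopped_at, best_loss)  # → 6 0.5
```

Here the loss bottoms out at epoch 4 (0.50), and training halts at epoch 6 after two epochs without improvement — before the rising loss signals overfitting.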

📌 Epochs vs. Batches vs. Iterations

  • Batch — a subset of the dataset processed at once (e.g., 32 samples).

  • Iteration — one update step of the weights (processing a single batch).

  • Epoch — one full pass through the dataset (all batches processed once).

So:

  • Epochs = how many times the model sees the full dataset.

  • Iterations = how many times weights are updated.

  • Batches = how many samples are processed per iteration.

✅ Example

  • Dataset size = 10,000 samples

  • Batch size = 100

  • Epochs = 5

➡️ Each epoch = 100 iterations (10,000 ÷ 100).

➡️ Total training = 500 iterations (100 × 5).
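The arithmetic above can be computed explicitly, using the same numbers from the example:

```python
# Relationship between epochs, batches, and iterations.
dataset_size = 10_000
batch_size = 100
epochs = 5

iterations_per_epoch = dataset_size // batch_size   # batches processed per epoch
total_iterations = iterations_per_epoch * epochs    # total weight updates

print(iterations_per_epoch, total_iterations)  # → 100 500
```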

In short: Epochs are the number of times the model cycles through the entire dataset during training.
