Sunday, 16 November 2025

Deep Learning Interview Question 01: Batch Processing and Weight Updates

If we train a model on 64 batches, where batches 1–32 result in a weight of 0.2 and batches 33–64 result in a weight of 0.3, will the model continue using the previously updated weights from the earlier batches, or will it start fresh with new weights for each batch range?

  • Batches 1–32 → weight = 0.2. The model processes these batches, computes gradients, and updates its parameters. After this step, the model's weights are no longer the initial ones; they have been adjusted to reflect what was learned from batches 1–32.

  • Batches 33–64 → weight = 0.3. When the model moves on to the next set of batches, it does not reset to the old weights. Instead, it continues from the weights as they stand after batch 32, and the new batches refine those parameters further (see the sketch below).
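For concreteness, here is a minimal PyTorch-style training loop; the layer, dummy data, and learning rate are placeholders, not part of the original question. The point is that optimizer.step() modifies the same parameter tensors on every iteration, so batch 33 simply picks up the weights that batch 32 left behind.

import torch
import torch.nn as nn

model = nn.Linear(10, 1)                       # one small layer whose weights are updated in place
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# 64 dummy mini-batches of 8 samples each, standing in for real data
data = [(torch.randn(8, 10), torch.randn(8, 1)) for _ in range(64)]

for batch_idx, (x, y) in enumerate(data, start=1):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()                           # updates the SAME parameters every batch
    # Whether batch_idx is 5 or 50, model.weight now holds the latest values;
    # batch 33 starts from whatever batch 32 left behind.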

⚙️ Key Principle

  • In training, the model always uses the latest weights (the ones updated after the previous batch).

  • It does not start fresh for each batch range unless you explicitly reinitialize the model (the snippet after this list shows the difference).

  • So in your example, batches 33–64 will be processed using the weights that already include learning from batches 1–32.
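To make the contrast explicit, the sketch below (again with a placeholder layer and dummy data) shows that a weight snapshot taken before a batch no longer matches the weights after optimizer.step(), and that only an explicit call such as reset_parameters() discards the learned values.

import torch
import torch.nn as nn

model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

before = model.weight.detach().clone()          # snapshot of the current weights

x, y = torch.randn(8, 10), torch.randn(8, 1)    # one dummy batch
optimizer.zero_grad()
nn.MSELoss()(model(x), y).backward()
optimizer.step()

print(torch.equal(before, model.weight))        # False: the same tensor was updated in place

# The only way to "start fresh" is an explicit reinitialization,
# which throws away everything learned so far:
model.reset_parameters()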

📌 Analogy

Think of it like writing a book:

  • After chapters 1–32, you’ve already built the storyline (weights = 0.2).

  • When you write chapters 33–64, you don’t throw away the first half — you continue building on it (weights evolve to 0.3).

Answer: The model always continues from the most recently updated weights, carried over from the last batch. It does not start with a fresh model for each batch range unless you explicitly reset or reinitialize it.

