What is an Epoch?
Epoch — One complete pass of the training dataset through the machine learning algorithm.
One epoch means the model has seen every example in the training dataset once. Training typically runs for multiple epochs — 3 to 10 for fine-tuning, potentially hundreds for training from scratch. Too many epochs risk overfitting; too few risk underfitting.
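The idea can be sketched in a few lines: each epoch is one full loop over the dataset, and the weights update within every pass. This is a minimal toy example (the data, learning rate, and single-weight model are illustrative assumptions, not from any particular library):

```python
# Toy data: y = 2x. One epoch = one full pass over this list.
dataset = [(x, 2 * x) for x in range(10)]
num_epochs = 3

w = 0.0    # single learnable weight, starts at zero
lr = 0.01  # learning rate

for epoch in range(num_epochs):
    for x, y in dataset:           # iterating the whole dataset once = one epoch
        pred = w * x
        grad = 2 * (pred - y) * x  # gradient of squared error w.r.t. w
        w -= lr * grad
    print(f"epoch {epoch + 1}: w = {w:.3f}")
```

After a few epochs, `w` converges toward the true value of 2 — each additional epoch refines the same weights using the same examples.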
Frequently Asked Questions
How many epochs should I train for?
There is no universal answer. Monitor validation loss — stop training when it stops improving. For LLM fine-tuning, 1-5 epochs is typical. For smaller models, 10-100 epochs may be needed.
What happens if I train too many epochs?
The model overfits — it memorizes training data instead of learning general patterns. Performance on new data will degrade even as training metrics continue improving.
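The gap between memorizing and generalizing can be made concrete with a deliberately extreme toy model. A 1-nearest-neighbor "classifier" stores the training set verbatim, so its training accuracy is perfect even when the labels are pure noise — yet it learns nothing transferable (the setup below is an illustrative assumption, not from the article):

```python
import random

random.seed(0)
# Random inputs with random 0/1 labels: there is no real pattern to learn.
train = [(random.random(), random.choice([0, 1])) for _ in range(50)]
test = [(random.random(), random.choice([0, 1])) for _ in range(50)]

def predict(x):
    # Return the label of the closest memorized training point.
    return min(train, key=lambda p: abs(p[0] - x))[1]

train_acc = sum(predict(x) == y for x, y in train) / len(train)
test_acc = sum(predict(x) == y for x, y in test) / len(test)
print(train_acc)  # → 1.0: every training point is its own nearest neighbor
print(test_acc)   # roughly chance level: memorization doesn't generalize
```

Training a network for too many epochs pushes it in the same direction: training metrics keep improving because the model is fitting noise specific to the training set, while held-out performance stalls or degrades.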
Is one epoch enough?
Sometimes. For very large datasets, a single pass may be sufficient. Many large language models are trained for less than one epoch over their massive training corpora.