What is Overfitting?

Overfitting — When an AI model learns its training data too well, failing to generalize to new, unseen data.

An overfitted model memorizes training examples instead of learning general patterns. It performs extremely well on the training data but poorly on data it has not seen. The classic sign is near-perfect training accuracy paired with significantly lower validation accuracy. Regularization and larger datasets help prevent it.
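Memorization can be made concrete with a toy sketch. The "model" below is just a lookup table over its training inputs, which is the extreme case of overfitting: it reproduces every training label (including noise) but has learned no rule to apply to unseen inputs. The even/odd labeling rule and the specific noisy points are illustrative assumptions, not from any real dataset.

```python
# Hypothetical ground-truth rule: label is 1 if x is even, else 0.
def true_label(x):
    return 1 if x % 2 == 0 else 0

# Small, noisy training set: two labels are deliberately flipped.
train = {x: true_label(x) for x in range(10)}
train[3] = 1  # mislabeled point the memorizer will faithfully reproduce
train[6] = 0  # another mislabeled point

def memorizer(x):
    # Perfect recall on training inputs; an arbitrary guess (0) elsewhere.
    return train.get(x, 0)

# 100% accuracy on the (noisy) training set...
train_acc = sum(memorizer(x) == y for x, y in train.items()) / len(train)

# ...but chance-level accuracy on unseen inputs.
test_points = range(10, 20)
test_acc = sum(memorizer(x) == true_label(x) for x in test_points) / 10
```

The memorizer scores 1.0 on training data and only 0.5 on the held-out points, exactly the train/validation gap described above.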

Frequently Asked Questions

How do I detect overfitting?

Compare training performance vs. validation performance. A large gap — where training accuracy is high but validation accuracy is low — is the classic indicator of overfitting.
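That comparison can be automated with a simple helper. The function name and the 0.1 gap threshold below are illustrative choices, not a standard; in practice the acceptable gap depends on the task.

```python
def overfitting_gap(train_acc, val_acc, threshold=0.1):
    """Return the train/validation accuracy gap and whether it
    exceeds a chosen threshold (a common heuristic for overfitting)."""
    gap = train_acc - val_acc
    return gap, gap > threshold

# High training accuracy, much lower validation accuracy: flagged.
gap, flagged = overfitting_gap(0.99, 0.72)
```

Here the gap is 0.27, well above the 0.1 threshold, so the check flags likely overfitting.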

What causes overfitting?

Training too long on too little data, using an overly complex model for the task, or having noisy/mislabeled training data. Each allows the model to memorize instead of generalize.

How do I prevent overfitting?

Use more training data, apply regularization (dropout, weight decay), stop training early when validation performance plateaus, and use data augmentation techniques.
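Early stopping, one of the techniques above, can be sketched in a few lines: track the best validation loss seen so far and stop once it has failed to improve for a set number of epochs. The `patience` parameter and the example loss curve are illustrative assumptions.

```python
def early_stopping(val_losses, patience=2):
    """Return the epoch with the best validation loss, stopping the
    scan once `patience` epochs pass without improvement."""
    best = float("inf")
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            break  # validation loss has plateaued: stop training here
    return best_epoch  # roll back to the best checkpoint

# Typical overfitting curve: validation loss falls, then climbs again.
stop_at = early_stopping([0.9, 0.7, 0.6, 0.65, 0.7, 0.8], patience=2)
```

With this curve the helper returns epoch 2, the point where validation loss bottomed out before the model began overfitting.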
