What Is a Hyperparameter?

Hyperparameter — a parameter whose value controls the learning process of a machine learning model.

Hyperparameters are configuration settings chosen before training begins, such as the learning rate, batch size, number of layers, and dropout rate. Unlike model parameters (weights), which are learned automatically during training, hyperparameters must be set by the developer, and they significantly affect training outcomes.
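The distinction can be sketched in a few lines of plain Python. This is a hypothetical minimal example, not tied to any particular framework: the weight `w` is a model parameter that the training loop learns, while `learning_rate` and `epochs` are hyperparameters fixed by hand before the loop runs.

```python
# Fit y = 2x with stochastic gradient descent on a squared-error loss.
learning_rate = 0.01  # hyperparameter: chosen before training
epochs = 50           # hyperparameter: chosen before training

data = [(x, 2.0 * x) for x in range(1, 6)]

w = 0.0               # model parameter: learned automatically
for _ in range(epochs):
    for x, y in data:
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= learning_rate * grad

print(round(w, 3))  # w converges toward the true slope, 2.0
```

Changing `learning_rate` or `epochs` changes how (and whether) `w` reaches 2.0, which is exactly why such settings need deliberate tuning.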

Frequently Asked Questions

How do I choose the right hyperparameters?

Start with established defaults for your model type. Then use systematic tuning methods like grid search, random search, or Bayesian optimization to find optimal values for your specific task.
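Of the methods above, random search is the simplest to sketch. The snippet below is a hedged illustration: `evaluate` is a hypothetical stand-in for "train a model with this configuration and return a validation score", and the score surface it computes is invented purely so the example runs on its own.

```python
import math
import random

def evaluate(learning_rate, batch_size):
    # Hypothetical score surface peaking near lr=0.01, batch_size=64.
    # A real evaluate() would train a model and return validation accuracy.
    return -abs(math.log10(learning_rate) + 2) - abs(batch_size - 64) / 64

random.seed(0)
best_score, best_config = float("-inf"), None
for _ in range(30):  # try 30 random configurations
    config = {
        # Sample the learning rate log-uniformly: it spans orders of magnitude.
        "learning_rate": 10 ** random.uniform(-5, -1),
        "batch_size": random.choice([16, 32, 64, 128, 256]),
    }
    score = evaluate(**config)
    if score > best_score:
        best_score, best_config = score, config

print(best_config)
```

Grid search would enumerate every combination instead of sampling; Bayesian optimization would use past scores to propose the next configuration rather than drawing it at random.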

What is the most important hyperparameter?

Learning rate. Too high and training becomes unstable. Too low and training is painfully slow. Most other hyperparameters are secondary to getting the learning rate right.
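Both failure modes are easy to demonstrate on a toy problem. This sketch runs gradient descent on f(w) = w², whose gradient is 2w, so each step multiplies w by (1 - 2·lr); the minimum is at w = 0.

```python
def descend(lr, steps=20, w=1.0):
    """Gradient descent on f(w) = w**2 starting from w = 1.0."""
    for _ in range(steps):
        w -= lr * 2 * w  # gradient of w**2 is 2w
    return w

print(descend(0.01))  # too low: after 20 steps, still far from 0
print(descend(0.4))   # well chosen: converges to ~0 quickly
print(descend(1.1))   # too high: steps overshoot and |w| grows
```

With lr = 1.1 each step multiplies w by -1.2, so the iterate oscillates with growing magnitude, which is the instability described above.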

Can hyperparameter tuning be automated?

Yes. Tools like Optuna, Ray Tune, and Weights & Biases Sweeps automate the search process. AutoML platforms handle hyperparameter tuning as part of end-to-end model training.
