#71 Machine Learning & Data Science Challenge 71

List the approaches to hyperparameter tuning in Deep Learning.

Setting hyperparameters requires expertise and extensive trial and error.

  • There are no simple, reliable rules for setting hyperparameters: in particular the learning rate, batch size, momentum, and weight decay. The sketch below shows where each of these four knobs appears in a typical training setup.
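
For concreteness, here is a minimal PyTorch sketch; the model and all numeric values are illustrative assumptions, not recommendations:

```python
# Where the four hyperparameters appear in a typical PyTorch setup.
# The model, data, and numeric values are illustrative assumptions.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))
loader = DataLoader(dataset, batch_size=32)  # batch size

model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.1,             # learning rate
    momentum=0.9,       # momentum
    weight_decay=1e-4,  # weight decay (L2 regularization)
)
```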

Approaches to searching for the best configuration (contrasted in the sketch after this list):

• Grid Search

• Random Search
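
Grid search exhaustively evaluates every combination on a fixed grid, while random search samples a fixed number of configurations from distributions, which tends to scale better as the number of hyperparameters grows. The sketch below contrasts the two using scikit-learn; the model, parameter ranges, and search budget are illustrative assumptions.

```python
# A minimal sketch contrasting grid search and random search with
# scikit-learn; model and parameter ranges are illustrative assumptions.
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
model = SGDClassifier(loss="log_loss", learning_rate="constant", random_state=0)

# Grid search: exhaustively evaluates every combination on the grid
# (here 3 x 2 = 6 configurations, each cross-validated 3 ways).
grid = GridSearchCV(
    model,
    param_grid={"alpha": [1e-4, 1e-3, 1e-2], "eta0": [0.01, 0.1]},
    cv=3,
)
grid.fit(X, y)
print("grid search best params:", grid.best_params_)

# Random search: samples a fixed budget of configurations from
# continuous distributions instead of enumerating a grid.
rand = RandomizedSearchCV(
    model,
    param_distributions={"alpha": loguniform(1e-5, 1e-1),
                         "eta0": loguniform(1e-3, 1e0)},
    n_iter=10,
    cv=3,
    random_state=0,
)
rand.fit(X, y)
print("random search best params:", rand.best_params_)
```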

Approach:

  1. Observe and understand the clues available during training: monitor the validation/test loss early on, and tune the architecture and hyperparameters with short runs of a few epochs (see the sketch after this list).

  2. Early signs of underfitting or overfitting in the validation or test loss are useful signals for tuning the hyperparameters.
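
A minimal PyTorch sketch of this short-run workflow, assuming synthetic data and illustrative candidate learning rates: train each candidate for a few epochs and compare the early validation losses.

```python
# Short-run tuning sketch (PyTorch): train each candidate learning rate
# for a few epochs and compare early validation loss. The synthetic data
# and candidate values are illustrative assumptions.
import torch
from torch import nn

torch.manual_seed(0)
X_train, y_train = torch.randn(800, 10), torch.randn(800, 1)
X_val, y_val = torch.randn(200, 10), torch.randn(200, 1)
loss_fn = nn.MSELoss()

for lr in [1e-1, 1e-2, 1e-3]:
    model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)

    # Short run: a few epochs is often enough to spot a diverging or
    # clearly under/overfitting configuration before a full run.
    for epoch in range(3):
        for i in range(0, len(X_train), 32):  # mini-batches of 32
            xb, yb = X_train[i:i + 32], y_train[i:i + 32]
            optimizer.zero_grad()
            loss_fn(model(xb), yb).backward()
            optimizer.step()

    with torch.no_grad():
        val_loss = loss_fn(model(X_val), y_val).item()
    print(f"lr={lr:g}  val_loss after 3 epochs: {val_loss:.4f}")
```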

Tools for Optimizing Hyperparameters:

• Amazon SageMaker

• Comet.ml

• Weights & Biases

• Deep Cognition

• Azure ML