The “weights” of a neural network are referred to as “parameters” in PyTorch code, and they are fine-tuned by the optimizer during training. Hyperparameters, in contrast, are the settings of a neural network that are fixed by design and not tuned during training. Examples are the number of hidden layers and the choice of activation functions.
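To make the distinction concrete, below is a minimal sketch of grid searching two hyperparameters — the hidden layer width and the activation function — while the optimizer updates the model's parameters. It is not the post's full recipe; the synthetic data, the `build_model` and `train_and_score` helpers, and the chosen grid values are illustrative assumptions.

```python
import itertools
import torch
import torch.nn as nn
import torch.optim as optim

# Synthetic binary-classification data (illustrative only).
torch.manual_seed(0)
X = torch.randn(200, 8)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)

def build_model(n_hidden, activation):
    # n_hidden and activation are hyperparameters: fixed by design,
    # not changed by the optimizer during training.
    return nn.Sequential(
        nn.Linear(8, n_hidden),
        activation(),
        nn.Linear(n_hidden, 1),
        nn.Sigmoid(),
    )

def train_and_score(model, epochs=30):
    # The model's weights and biases are its parameters: these are
    # what the optimizer updates at every step.
    optimizer = optim.Adam(model.parameters(), lr=0.01)
    loss_fn = nn.BCELoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()
    with torch.no_grad():
        accuracy = ((model(X) > 0.5).float() == y).float().mean().item()
    return accuracy

# Exhaustive grid search over the two hyperparameters.
grid = {"n_hidden": [4, 8, 16], "activation": [nn.ReLU, nn.Tanh]}
best = None
for n_hidden, activation in itertools.product(grid["n_hidden"], grid["activation"]):
    score = train_and_score(build_model(n_hidden, activation))
    if best is None or score > best[0]:
        best = (score, n_hidden, activation.__name__)

print(f"Best accuracy {best[0]:.3f} with n_hidden={best[1]}, activation={best[2]}")
```

In practice you would evaluate each combination on a held-out validation set (or with cross-validation) rather than on the training data, but the loop structure stays the same.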