#58 Machine Learning & Data Science Challenge 58

What is backward propagation?

Backpropagation is the essence of neural network training: it fine-tunes the weights of the network based on the error rate obtained in the previous epoch (training pass).

  • Proper tuning of the weights reduces the error rate and makes the model more reliable by improving its generalization.

  • Backpropagation is a short form of "backward propagation of errors."

  • This is the standard method of training artificial neural networks.

  • It provides an efficient way to calculate the gradient of a loss function with respect to all the weights in the network.
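The points above can be sketched in code. Below is a minimal, self-contained example (with hypothetical layer sizes and random data, for illustration only) of one forward pass, one backward pass that propagates the error from the output back through a single hidden layer, and one gradient-descent weight update:

```python
import numpy as np

# Hypothetical tiny dataset: 4 samples, 3 features, 1 regression target.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))
W1 = rng.normal(size=(3, 5)) * 0.1   # input -> hidden weights
W2 = rng.normal(size=(5, 1)) * 0.1   # hidden -> output weights

# Forward pass.
h = np.tanh(x @ W1)                  # hidden activations
pred = h @ W2                        # network output
loss = np.mean((pred - y) ** 2)      # mean squared error

# Backward pass: propagate the error from the output layer back
# toward the input, computing the gradient of the loss with
# respect to each weight matrix via the chain rule.
d_pred = 2 * (pred - y) / y.size     # dL/dpred
dW2 = h.T @ d_pred                   # dL/dW2
d_h = d_pred @ W2.T                  # error sent backward through W2
dW1 = x.T @ (d_h * (1 - h ** 2))     # chain rule through tanh'

# Gradient-descent update using the computed gradients.
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```

After the update, re-running the forward pass yields a lower loss, which is exactly the "fine-tuning based on the previous epoch's error" described above; real frameworks repeat this loop over many epochs.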

Advantages:

  • Backpropagation is fast, simple, and easy to program.

  • It has no hyperparameters to tune apart from the number of inputs.

  • It is a flexible method, as it requires no prior knowledge about the network.

  • It is the standard method that generally works well.

  • It does not need any special mention of the features of the function to be learned.