What are Epochs?
One epoch is one complete pass of the entire training dataset forward and backward through the neural network.
Because the full dataset is usually too large to feed to the network at once, each epoch is divided into several smaller batches.
We almost always train for more than one epoch, because a single epoch typically leads to underfitting. As the number of epochs increases, the weights are updated more and more times, and the model moves from underfitting toward an optimal fit and, eventually, toward overfitting.
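The relationship between epochs, batches, and weight updates can be sketched with a toy training loop. This is a minimal illustration, not a real neural network: the data, the linear model, and all names (`batch_size`, `num_epochs`, etc.) are assumptions chosen for the example.

```python
import numpy as np

# Toy dataset (illustrative): 1000 samples, 4 features, linear target.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0])

w = np.zeros(4)      # model weights
lr = 0.5             # learning rate
batch_size = 100     # each epoch is split into 1000 / 100 = 10 batches
num_epochs = 5       # more than one epoch: weights get updated many times

for epoch in range(num_epochs):
    # One epoch = one full pass over the dataset, batch by batch.
    for start in range(0, len(X), batch_size):
        xb = X[start:start + batch_size]
        yb = y[start:start + batch_size]
        grad = xb.T @ (xb @ w - yb) / len(xb)  # mean-squared-error gradient
        w -= lr * grad                          # one weight update per batch

print(np.round(w, 2))
```

Each epoch here triggers ten weight updates (one per batch), so five epochs give the weights fifty chances to move toward the target values, which is why more epochs let the model progress from underfitting toward a good fit.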