Why do we need Non-linear activation functions?
A neural network without activation functions is essentially a linear regression model: no matter how many layers it has, a composition of linear transformations is still a linear transformation. Activation functions apply a non-linear transformation to the layer outputs, making the network capable of learning and performing more complex tasks. Commonly used activation functions include:
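The collapse of stacked linear layers into a single linear map can be checked numerically. A minimal sketch with NumPy (the weight shapes here are arbitrary, chosen just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation function between them.
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))
x = rng.standard_normal(3)

# Passing x through both layers in sequence...
h = W1 @ x
y = W2 @ h

# ...gives exactly the same result as one linear layer W = W2 @ W1.
W = W2 @ W1
print(np.allclose(y, W @ x))  # True
```

Inserting a non-linearity between the two layers breaks this equivalence, which is precisely what lets depth add expressive power.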
- Identity
- Binary Step
- Sigmoid
- Tanh
- ReLU
- Leaky ReLU
- Softmax