
What is Activation Function in NN


Answers (1)

Nikhil Tomar

Activation functions in Neural Networks (NNs) introduce non-linearity into the model, which lets the network learn complex relationships between inputs and outputs. Without them, a network of any depth could only represent a linear mapping.

Common activation functions include:

1. *Sigmoid*: Maps inputs to a value between 0 and 1, useful for binary classification.
2. *ReLU (Rectified Linear Unit)*: Maps negative values to 0 and passes positive values through unchanged, widely used in hidden layers.
3. *Tanh (Hyperbolic Tangent)*: Similar to sigmoid, but maps inputs to a value between -1 and 1.
4. *Softmax*: Often used in output layers for multi-class classification, as it outputs probabilities for each class.
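The four functions above can be sketched in a few lines of NumPy (a minimal illustration, not tied to any particular framework):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zeroes out negatives, passes positives through unchanged
    return np.maximum(0.0, x)

def tanh(x):
    # Squashes any real input into (-1, 1)
    return np.tanh(x)

def softmax(x):
    # Subtract the max for numerical stability, then normalize to probabilities
    e = np.exp(x - np.max(x))
    return e / e.sum()

x = np.array([-2.0, 0.0, 3.0])
print(sigmoid(x))         # each value lies in (0, 1)
print(relu(x))            # [0. 0. 3.]
print(softmax(x).sum())   # 1.0 -- softmax outputs a probability distribution
```

Note the max-subtraction in `softmax`: it does not change the result but prevents overflow for large inputs.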

Activation functions help the network:

1. Learn non-linear relationships
2. Avoid collapsing stacked layers into a single linear transformation
3. Improve training and convergence
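Point 2 can be verified directly: two linear layers without an activation compute exactly the same function as one combined linear layer, while inserting a non-linearity breaks that collapse. A minimal NumPy sketch (weights and input are arbitrary random values for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # first layer weights
W2 = rng.normal(size=(2, 4))   # second layer weights
x = rng.normal(size=3)         # input vector

# Without an activation, W2 @ (W1 @ x) equals (W2 @ W1) @ x:
# the two layers collapse into one linear map.
no_act = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
print(np.allclose(no_act, collapsed))  # True

# Inserting ReLU between the layers makes the composition non-linear,
# so the network gains representational power from depth.
relu = lambda z: np.maximum(0.0, z)
with_act = W2 @ relu(W1 @ x)
print(with_act)
```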

Would you like more details or specific information on activation functions?
