Neural Network Activation Functions

Ryan
2 min read · Jul 29, 2021


An activation function outputs a specific value once the sum of its inputs exceeds a threshold. The step, sigmoid, tanh, ReLU, and softmax functions are the most commonly used activation functions.

Step Function

The step function's output signal is shaped like a staircase: it outputs 0 while x ≤ 0, then jumps to 1 once x exceeds 0.

h(x) = 1 if x > 0, otherwise 0

And it produces this graph:

[Graph: the step function, jumping from 0 to 1 at x = 0]
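For concreteness, here is a minimal NumPy sketch of the step function (the function name and sample inputs are my own):

```python
import numpy as np

def step(x):
    # 1 where x > 0, otherwise 0
    return (x > 0).astype(np.float64)

print(step(np.array([-1.0, 0.0, 2.0])))  # [0. 0. 1.]
```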

Sigmoid Function

Sigmoid means "shaped like the letter S," and the graph indeed resembles an S. The sigmoid function is defined as:

h(x) = 1 / (1 + e^(-x))

And it produces this graph:

[Graph: the sigmoid function, an S-shaped curve rising from 0 to 1]
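A minimal NumPy sketch of the sigmoid, following the definition above:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-2.0, 0.0, 2.0])))  # [0.1192... 0.5 0.8807...]
```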

tanh Function

tanh is the hyperbolic tangent. Its graph is similar to the sigmoid's, but while the sigmoid outputs values from 0 to 1, tanh outputs values from -1 to 1.
tanh is defined as:

h(x) = (e^x - e^(-x)) / (e^x + e^(-x))

And it produces this graph:

[Graph: the tanh function, an S-shaped curve rising from -1 to 1]
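The same idea in NumPy; np.tanh is the built-in equivalent, but it is spelled out here to mirror the definition above:

```python
import numpy as np

def tanh(x):
    # Matches np.tanh(x); written out to mirror the definition above
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

print(tanh(np.array([-1.0, 0.0, 1.0])))  # [-0.7615... 0. 0.7615...]
```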

ReLU Function

The sigmoid function used to be the standard in the field of neural networks, but nowadays the ReLU (Rectified Linear Unit) function is used more often. As "rectified" suggests, ReLU turns negative values into 0: if the input is 0 or below, it outputs 0; if the input exceeds 0, it outputs the input unchanged.

h(x) = max(0, x)

And it produces this graph:

[Graph: the ReLU function, flat at 0 for x ≤ 0 and y = x for x > 0]
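A one-line NumPy sketch of ReLU:

```python
import numpy as np

def relu(x):
    # Passes positive inputs through unchanged; clamps negatives to 0
    return np.maximum(0, x)

print(relu(np.array([-3.0, 0.0, 5.0])))  # [0. 0. 5.]
```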

Softmax

Softmax is an activation function that normalizes a network's outputs into a probability distribution over the predicted output classes: every value falls between 0 and 1, and the values sum to 1. Softmax can be defined as:

softmax(x_i) = e^(x_i) / (e^(x_1) + e^(x_2) + ... + e^(x_n))

where:

x = the vector of input values
n = the number of classes in the multi-class classifier
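A minimal NumPy sketch of softmax; subtracting the max before exponentiating is a standard numerical-stability trick and does not change the result:

```python
import numpy as np

def softmax(x):
    # Shift by the max so np.exp never overflows; the ratio is unchanged
    e = np.exp(x - np.max(x))
    return e / np.sum(e)

scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs)        # [0.659... 0.2424... 0.0986...]
print(probs.sum())  # 1.0
```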


Written by Ryan

iOS engineer & data science enthusiast. Into algorithmic trading. https://github.com/Rsych
