Activation Functions

Activation functions introduce non-linearity into a model. An activation function takes a raw input value (often the result of a linear combination of variables) and maps it to a specific output range. Many activation functions exist; we list a popular few here.

Sigmoid / Logistic Function

$$\sigma(x) = \frac{1}{1 + e^{-x}}$$

Domain: (−∞, ∞)
Range: (0, 1)
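A minimal Python sketch of the formula above (the function name `sigmoid` is illustrative):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: maps any real x into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))
```

Note that `sigmoid(0)` is exactly 0.5, and the output saturates toward 0 and 1 for large negative and positive inputs.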
Softmax

$$\sigma(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}$$

Extends the sigmoid to multiple dimensions: it maps a vector of K real values to a probability distribution over K classes.
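A minimal Python sketch of the softmax formula, using the common max-subtraction trick for numerical stability (this shift does not change the result, since it cancels in the ratio):

```python
import math

def softmax(z: list[float]) -> list[float]:
    """Map a vector of K reals to a probability distribution over K classes."""
    m = max(z)                                # subtract max for numerical stability
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]
```

The outputs are all in (0, 1) and sum to 1, with larger inputs receiving larger probabilities.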
ReLU (Rectified Linear Unit)

$$f(x) = \max(0, x)$$

Domain: (−∞, ∞)
Range: [0, ∞)
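A one-line Python sketch of ReLU:

```python
def relu(x: float) -> float:
    """Rectified Linear Unit: passes positive inputs through, zeroes out negatives."""
    return max(0.0, x)
```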
Leaky ReLU

$$f(x) = \begin{cases} x & \text{if } x > 0 \\ \alpha x & \text{otherwise} \end{cases}$$

where 0 < α < 1 is a small constant, typically 0.01.

Domain: (−∞, ∞)
Range: (−∞, ∞)
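A minimal Python sketch, with α exposed as a parameter defaulting to the typical 0.01:

```python
def leaky_relu(x: float, alpha: float = 0.01) -> float:
    """Like ReLU, but negative inputs keep a small slope alpha instead of zero."""
    return x if x > 0 else alpha * x
```

Unlike ReLU, negative inputs still produce a small non-zero gradient, which can help avoid "dead" units.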
Heaviside (Step Function)

H(x)={0if x<01if x0

Domain: (−∞, ∞)
Range: {0, 1}
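A one-line Python sketch of the step function, using the H(0) = 1 convention from the definition above:

```python
def heaviside(x: float) -> int:
    """Step function: 0 for negative inputs, 1 otherwise (so H(0) = 1)."""
    return 0 if x < 0 else 1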
Powered by Forestry.md