
ReLU

19.11.2020 | Standard Functions/Activation


The ReLU function is a very basic activation function. It is defined as $$ ReLU(x) = \max(0,x) $$ That is, $ReLU(x) = x$ if $x \geq 0$, else $0$. A plot of the function is shown in Figure 1.

Figure 1: Plot of the ReLU function

The following Python 3 code was used for the plot:

import matplotlib.pyplot as plt

def relu(x):
    return max(0, x)

xs = range(-10, 11)
ys = [relu(x) for x in xs]
plt.plot(xs, ys)
plt.show()
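For larger inputs one would typically vectorize this instead of looping in Python. A minimal sketch using NumPy (assuming numpy is installed) could look like:

import numpy as np
import matplotlib.pyplot as plt

xs = np.arange(-10, 11)
ys = np.maximum(0, xs)  # element-wise ReLU over the whole array
plt.plot(xs, ys)
plt.show()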

The function is quite simple to differentiate. We have that $$ ReLU'(x)= \begin{cases} 0 & x \leq 0 \\ 1 & \text{otherwise} \end{cases} $$ Strictly speaking, ReLU is not differentiable at $x = 0$; the value $0$ there is a common convention.
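As a small illustration, the derivative can be plotted in the same style as the function above, reusing the matplotlib import. The name relu_prime is just illustrative, and the value at $0$ follows the convention used here:

def relu_prime(x):
    # Convention: the derivative at 0 is taken to be 0.
    return 1 if x > 0 else 0

xs = range(-10, 11)
ys = [relu_prime(x) for x in xs]
plt.plot(xs, ys)  # step from 0 to 1 at x = 0
plt.show()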
