
Softmax

11.11.2020 | Standard Functions/Normalization


Softmax is a normalization function. It is given as $$ \sigma(z)_i = \frac{e^{z_i}}{\sum_{j = 1}^{K} e^{z_j}} , \quad i = 1, \dots, K, \quad z = [z_1, \dots, z_K] $$ That is, we normalize a vector $z$ using the exponential of each entry $z_i$. The result is a new vector with the same dimension as the original, but whose elements sum to 1, hence normalized. As a Python example we have

import numpy as np

def softmax(x):
    return np.exp(x) / np.sum(np.exp(x))

x = np.array([1.0, 2.0, 3.0])
x_norm = softmax(x)
print(x)
print(x_norm)
print(np.sum(x_norm))

With the following results, where the normalized entries sum to 1 (up to floating-point rounding)

[1. 2. 3.]
[0.09003057 0.24472847 0.66524096]
1.0
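One practical caveat, not covered above: for large entries $z_i$, computing $e^{z_i}$ directly overflows to infinity. The standard remedy is to subtract $\max(z)$ from every entry before exponentiating; since $\sigma(z + c) = \sigma(z)$ for any constant $c$ (the factor $e^{c}$ cancels between numerator and denominator), the result is unchanged. A minimal sketch (the function name softmax_stable is my own):

```python
import numpy as np

def softmax_stable(x):
    # Subtracting the maximum entry leaves softmax unchanged
    # (the common factor e^{-max} cancels), but guarantees the
    # exponentials are at most e^0 = 1, so they cannot overflow.
    shifted = x - np.max(x)
    e = np.exp(shifted)
    return e / np.sum(e)

x = np.array([1000.0, 1001.0, 1002.0])
# The naive version would compute np.exp(1000) = inf and return nan;
# the shifted version returns finite values that sum to 1.
print(softmax_stable(x))
```

Note that the output equals softmax([0, 1, 2]), since the input is just that vector shifted by the constant 1000.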