deserialize(...): Returns activation function given a string identifier.
elu(...): Exponential Linear Unit.
exponential(...): Exponential activation function.
gelu(...): Applies the Gaussian error linear unit (GELU) activation function.
get(...): Returns the activation function matching a given identifier.
hard_sigmoid(...): Hard sigmoid activation function.
linear(...): Linear activation function (pass-through).
relu(...): Applies the rectified linear unit activation function.
selu(...): Scaled Exponential Linear Unit (SELU).
serialize(...): Returns the string identifier of an activation function.
sigmoid(...): Sigmoid activation function, sigmoid(x) = 1 / (1 + exp(-x)).
softmax(...): Softmax converts a real vector to a vector of categorical probabilities.
softplus(...): Softplus activation function, softplus(x) = log(exp(x) + 1).
softsign(...): Softsign activation function, softsign(x) = x / (abs(x) + 1).
swish(...): Swish activation function, swish(x) = x * sigmoid(x).
tanh(...): Hyperbolic tangent activation function.
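
A minimal usage sketch, assuming TensorFlow 2.x, showing how the functions listed above are applied directly to tensors and looked up by string identifier (exact serialize() output can vary slightly between Keras versions):

```python
import tensorflow as tf

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# Apply activation functions directly to a tensor.
print(tf.keras.activations.relu(x).numpy())     # negatives clipped to 0
print(tf.keras.activations.sigmoid(x).numpy())  # values squashed into (0, 1)

# softmax expects at least a 2-D input; each row sums to 1.
print(tf.keras.activations.softmax(tf.reshape(x, (1, -1))).numpy())

# get()/deserialize() map a string identifier to a function;
# serialize() goes the other way.
fn = tf.keras.activations.get('swish')
print(tf.keras.activations.serialize(fn))       # typically 'swish'

# The same string identifiers can be passed to layers.
layer = tf.keras.layers.Dense(8, activation='gelu')
```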