Activation function types
Deep Learning
- `deserialize(...)`: Returns an activation function given a string identifier.
- `elu(...)`: Exponential Linear Unit.
- `exponential(...)`: Exponential activation function.
- `gelu(...)`: Applies the Gaussian error linear unit (GELU) activation function.
- `get(...)`: Returns an activation function given a string identifier or callable.
- `hard_sigmoid(...)`: Hard sigmoid activation function.
- `linear(...)`: Linear activation function (pass-through).
- `relu(...)`: Applies the rectified linear unit activation function.
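To make the definitions behind a few of these concrete, here is a minimal NumPy sketch. It assumes the standard formulas: ReLU as `max(0, x)`, ELU as `x` for positive inputs and `alpha * (exp(x) - 1)` otherwise, the commonly used piecewise-linear hard sigmoid `clip(0.2 * x + 0.5, 0, 1)`, and the tanh approximation of GELU. These are hand-rolled illustrations, not the Keras implementations themselves.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: max(0, x) elementwise.
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # Exponential linear unit: identity for x > 0,
    # alpha * (exp(x) - 1) for x <= 0.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def hard_sigmoid(x):
    # One common piecewise-linear approximation of the sigmoid.
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

def gelu(x):
    # Tanh approximation of GELU:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi)
                                    * (x + 0.044715 * x ** 3)))
```

In Keras you would not define these yourself; you can pass the string name (e.g. `activation="relu"`) to a layer, and `get(...)` / `deserialize(...)` resolve such strings to the corresponding callables.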