Package io.github.kirstenali.deepj.activations
Interfaces

ActivationFunction
    Common interface implemented by the activation functions in this package.

Classes

GELU
    Gaussian Error Linear Unit (GELU), using the tanh approximation popularized by GPT-2.
ReLU
    Rectified Linear Unit: f(x) = max(0, x).
Sigmoid
    Logistic sigmoid: f(x) = 1 / (1 + e^-x), mapping inputs to (0, 1).
Softmax
    Row-wise softmax for 2D tensors: applies softmax independently to each row.
Tanh
    Hyperbolic tangent: f(x) = tanh(x), mapping inputs to (-1, 1).
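The GELU entry above refers to the tanh approximation popularized by GPT-2. As a minimal standalone sketch of that formula (the class and method names here are illustrative, not the deepj API):

```java
public class GeluSketch {
    // GELU tanh approximation:
    //   gelu(x) = 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    static double gelu(double x) {
        double c = Math.sqrt(2.0 / Math.PI);
        return 0.5 * x * (1.0 + Math.tanh(c * (x + 0.044715 * x * x * x)));
    }

    public static void main(String[] args) {
        System.out.println(gelu(0.0)); // exactly 0
        System.out.println(gelu(1.0)); // close to 0.841
    }
}
```

For large positive x the function approaches x, and for large negative x it approaches 0, which is why GELU is often described as a smooth variant of ReLU.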
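The Softmax entry specifies row-wise application: softmax is computed independently over each row of a 2D tensor. A self-contained sketch of that behavior over a plain double[][] (again illustrative, not the library's tensor type), with the usual max-subtraction for numerical stability:

```java
public class SoftmaxSketch {
    // Row-wise softmax: each output row is non-negative and sums to 1.
    static double[][] softmax(double[][] m) {
        double[][] out = new double[m.length][];
        for (int i = 0; i < m.length; i++) {
            // Subtract the row maximum before exponentiating to avoid overflow.
            double max = Double.NEGATIVE_INFINITY;
            for (double v : m[i]) max = Math.max(max, v);
            double[] row = new double[m[i].length];
            double sum = 0.0;
            for (int j = 0; j < m[i].length; j++) {
                row[j] = Math.exp(m[i][j] - max);
                sum += row[j];
            }
            for (int j = 0; j < row.length; j++) row[j] /= sum;
            out[i] = row;
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] r = softmax(new double[][] {{1, 2, 3}, {0, 0, 0}});
        System.out.println(r[1][0]); // each entry of a uniform row is 1/3
        System.out.println(r[0][0] + r[0][1] + r[0][2]); // rows sum to 1
    }
}
```

Because each row is normalized independently, this matches the common use case of converting a batch of logit vectors (one per row) into per-example probability distributions.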