Class FNN
java.lang.Object
io.github.kirstenali.deepj.layers.FNN
Flexible fully-connected neural network (MLP) built from Linear projections.
This class is intentionally small: it exists as a convenience for users who want a classic ANN-style model while deepj remains transformer-oriented.
Design notes:
- Uses Linear so parameters can be optimized externally (AdamW/SGD/etc.).
- Accepts an ActivationFunction factory to avoid state-sharing bugs during backprop.
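The factory design note can be sketched in plain Java. The `ReLU` class below is a hypothetical stand-in for deepj's ActivationFunction (its real interface is not shown on this page); the point is that a `Supplier` hands each hidden layer a fresh instance, so the per-layer state cached for backprop is never shared between layers.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

public class Main {
    // Hypothetical stand-in for an activation: it caches its last input,
    // which is exactly the state that must not be shared across layers.
    static class ReLU {
        double[] lastInput; // cached so a backward pass could use it

        double[] forward(double[] x) {
            lastInput = x.clone();
            double[] out = new double[x.length];
            for (int i = 0; i < x.length; i++) out[i] = Math.max(0.0, x[i]);
            return out;
        }
    }

    public static void main(String[] args) {
        // A factory (Supplier) yields a distinct activation per hidden layer.
        Supplier<ReLU> factory = ReLU::new;
        int[] hiddenSizes = {8, 4};
        List<ReLU> acts = new ArrayList<>();
        for (int h : hiddenSizes) acts.add(factory.get());
        // Distinct instances: mutating one layer's cache cannot corrupt another's.
        System.out.println(acts.get(0) != acts.get(1));
    }
}
```

Passing a single shared ActivationFunction instance instead would make every layer overwrite the same cached input, which is the bug the factory parameter avoids.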
Constructor Summary
ConstructorsConstructorDescriptionFNN(int inputSize, int[] hiddenSizes, int outputSize, Supplier<ActivationFunction> hiddenActivationFactory, ActivationFunction outputActivation, Random rnd) Build an MLP of the form: Linear -> act -> Linear -> act -> ...FNN(int inputSize, int[] hiddenSizes, int outputSize, Supplier<ActivationFunction> hiddenActivationFactory, Random rnd) -
Method Summary
forward, backward, parameters (see Method Details)
Constructor Details
FNN
public FNN(int inputSize, int[] hiddenSizes, int outputSize, Supplier<ActivationFunction> hiddenActivationFactory, ActivationFunction outputActivation, Random rnd)
Build an MLP of the form: Linear -> act -> Linear -> act -> ... -> Linear -> (optional outputAct)
FNN
public FNN(int inputSize, int[] hiddenSizes, int outputSize, Supplier<ActivationFunction> hiddenActivationFactory, Random rnd)
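As a rough illustration of the layer shapes these constructors imply (the concrete sizes here are made up for the example), an FNN built with inputSize 3, hiddenSizes {8, 4}, and outputSize 2 would chain Linear projections 3->8, 8->4, 4->2:

```java
import java.util.ArrayList;
import java.util.List;

public class Main {
    public static void main(String[] args) {
        // Hypothetical sizes for illustration only.
        int inputSize = 3, outputSize = 2;
        int[] hiddenSizes = {8, 4};

        // Each hidden Linear maps the previous width to the next hidden width;
        // a final Linear maps the last hidden width to outputSize.
        List<int[]> shapes = new ArrayList<>();
        int prev = inputSize;
        for (int h : hiddenSizes) {
            shapes.add(new int[]{prev, h}); // hidden projection: prev -> h
            prev = h;
        }
        shapes.add(new int[]{prev, outputSize}); // output projection

        for (int[] s : shapes) System.out.println(s[0] + "->" + s[1]);
    }
}
```

The second constructor (without outputActivation) would produce the same chain of projections, just with no activation applied after the final Linear.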
Method Details
forward

backward

parameters
Specified by: parameters in interface Trainable
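Because parameters() implements Trainable precisely so that optimization happens outside the class, an external update step might look like the following sketch. The flat arrays are hypothetical stand-ins for the parameter tensors the FNN would expose; only the plain SGD rule itself is shown.

```java
public class Main {
    public static void main(String[] args) {
        // Hypothetical weight and gradient arrays standing in for the tensors
        // returned by parameters() after a backward() pass.
        double[] w    = {0.50, -0.50};
        double[] grad = {0.10, -0.20};
        double lr = 0.1;

        // Plain SGD step applied externally: w <- w - lr * grad
        for (int i = 0; i < w.length; i++) {
            w[i] -= lr * grad[i];
        }

        System.out.printf("%.2f %.2f%n", w[0], w[1]);
    }
}
```

An AdamW-style optimizer would iterate the same parameter list but keep per-parameter moment estimates; the Trainable contract is what makes either choice possible without changing the model class.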