Class FNN

java.lang.Object
io.github.kirstenali.deepj.layers.FNN
All Implemented Interfaces:
Layer, Trainable

public final class FNN extends Object implements Layer
Flexible fully-connected neural network (MLP) built from Linear projections.

This class is intentionally small: it exists as a convenience for users who want a classic ANN-style model while deepj remains transformer-oriented.
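The "MLP built from Linear projections" description above can be sketched as plain Java. This is a minimal illustration, not deepj's actual `Linear` API: the `linear`/`relu` helpers, weight layout, and class name are all assumptions made for the example.

```java
// Minimal sketch of an MLP forward pass built from stacked linear
// projections with a nonlinearity in between. The double[][] weight
// layout and helper names are illustrative, not deepj's real Linear class.
public class MlpSketch {

    // y[i] = b[i] + sum_j w[i][j] * x[j]  (one linear projection)
    static double[] linear(double[] x, double[][] w, double[] b) {
        double[] out = new double[b.length];
        for (int i = 0; i < b.length; i++) {
            double sum = b[i];
            for (int j = 0; j < x.length; j++) sum += w[i][j] * x[j];
            out[i] = sum;
        }
        return out;
    }

    // Elementwise ReLU nonlinearity between projections.
    static double[] relu(double[] x) {
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) out[i] = Math.max(0.0, x[i]);
        return out;
    }

    public static void main(String[] args) {
        double[] x = {1.0, -2.0};
        double[][] w1 = {{0.5, -0.5}, {1.0, 1.0}};
        double[] b1 = {0.0, 0.0};
        double[][] w2 = {{1.0, 1.0}};
        double[] b2 = {0.1};

        // Two stacked projections: Linear -> ReLU -> Linear.
        double[] h = relu(linear(x, w1, b1));
        double[] y = linear(h, w2, b2);
        System.out.println(y[0]); // prints 1.6
    }
}
```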

Design notes:

  • Uses Linear so parameters can be optimized externally (AdamW/SGD/etc.).
  • Accepts an ActivationFunction factory to avoid state-sharing bugs during backprop.
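The second design note, accepting a factory rather than a single activation instance, can be illustrated with a self-contained sketch. The `ReLU` class below is hypothetical and stands in for any stateful `ActivationFunction` implementation; it is not deepj's real API. The point it demonstrates: an activation that caches its forward input for backprop must not be shared across layers, and a `Supplier` factory gives each layer its own instance.

```java
import java.util.function.Supplier;

// Hypothetical stateful activation: caches its forward input so the
// backward pass can gate the gradient. This per-call state is exactly
// what makes sharing one instance across layers a bug.
class StatefulReLU {
    private double lastInput; // must be private to each layer
    double forward(double x) { lastInput = x; return Math.max(0.0, x); }
    double backward(double grad) { return lastInput > 0 ? grad : 0.0; }
}

public class ActivationFactoryDemo {
    public static void main(String[] args) {
        // A factory (Supplier) lets the network mint a fresh activation
        // per layer, so one layer's cached input cannot corrupt another's
        // backward pass.
        Supplier<StatefulReLU> factory = StatefulReLU::new;
        StatefulReLU a1 = factory.get(); // layer 1's own instance
        StatefulReLU a2 = factory.get(); // layer 2's own instance

        a1.forward(-3.0); // layer 1 saw a negative pre-activation
        a2.forward(5.0);  // layer 2 saw a positive pre-activation

        // Each instance backprops against its own cached input:
        System.out.println(a1.backward(1.0)); // prints 0.0 (gradient blocked)
        System.out.println(a2.backward(1.0)); // prints 1.0 (gradient passes)
    }
}
```

Had a single shared instance been passed instead of a factory, layer 2's `forward(5.0)` would overwrite layer 1's cached input, and layer 1 would incorrectly let the gradient through.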