sorix.nn.layers ¶
Linear ¶
Bases: Module
Applies a linear transformation to the incoming data.
Source code in sorix/nn/layers.py
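As a sketch of what a linear layer computes (assuming the common `y = x @ W.T + b` convention with an `out_features x in_features` weight matrix; the names and layout here are illustrative, not sorix's actual internals), a minimal NumPy version:

```python
import numpy as np

# Minimal sketch of a linear (fully connected) layer, assuming the
# common y = x @ W.T + b convention. Illustrative only.
rng = np.random.default_rng(0)

in_features, out_features = 4, 3
W = rng.normal(size=(out_features, in_features))  # weight matrix
b = np.zeros(out_features)                        # bias vector

def linear(x):
    """Apply the affine map y = x @ W.T + b."""
    return x @ W.T + b

x = rng.normal(size=(2, in_features))  # batch of 2 samples
y = linear(x)
print(y.shape)  # (2, 3)
```

Each row of the input batch is mapped independently from `in_features` to `out_features` dimensions.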
ReLU ¶
Applies the rectified linear unit function element-wise: ReLU(x) = max(0, x).
Sigmoid ¶
Applies the sigmoid function element-wise: Sigmoid(x) = 1 / (1 + exp(-x)).
Tanh ¶
Applies the hyperbolic tangent function element-wise.
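The three activations above are element-wise functions; a plain NumPy sketch of each (illustrative, not sorix's implementation):

```python
import numpy as np

# Element-wise activation functions corresponding to the
# ReLU, Sigmoid and Tanh layers. Plain-function sketches.
def relu(x):
    return np.maximum(0.0, x)          # clamp negatives to zero

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))    # squash into (0, 1)

def tanh(x):
    return np.tanh(x)                  # squash into (-1, 1)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))       # [0. 0. 2.]
print(sigmoid(0.0))  # 0.5
print(tanh(0.0))     # 0.0
```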
BatchNorm1d ¶
Bases: Module
Applies Batch Normalization over a 2D input (a mini-batch of 1D feature vectors), normalizing each feature over the batch dimension.
Source code in sorix/nn/layers.py
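A minimal sketch of the training-mode computation (per-feature statistics over the batch, then a learnable affine transform; running statistics and the actual sorix API are omitted):

```python
import numpy as np

# Sketch of batch normalization for a 2D input of shape
# (batch, features), training-mode statistics only. Illustrative.
def batch_norm_1d(x, gamma, beta, eps=1e-5):
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize each feature
    return gamma * x_hat + beta              # learnable scale and shift

x = np.array([[1.0, 10.0],
              [3.0, 30.0],
              [5.0, 50.0]])
y = batch_norm_1d(x, gamma=np.ones(2), beta=np.zeros(2))
print(y.mean(axis=0))  # ~[0. 0.]
```

After normalization each feature column has (approximately) zero mean and unit variance, which the `gamma`/`beta` parameters can then undo if that is what training favors.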
Dropout ¶
Bases: Module
During training, randomly zeroes some of the elements of the input tensor with probability p using samples from a Bernoulli distribution.
This implementation uses Inverted Dropout, meaning that the output is scaled by 1/(1-p) during training. This ensures that the expected value of the activations remains constant, allowing the layer to act as an identity function during inference.
Parameters:

- p (float, default: 0.5) – Probability of an element to be zeroed.
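The inverted-dropout scheme described above can be sketched as follows (the function name and signature are illustrative; the sorix layer manages train/eval mode itself):

```python
import numpy as np

# Sketch of inverted dropout: zero each element with probability p
# and scale the survivors by 1/(1-p), so the expected value of the
# activations is unchanged and inference is a plain identity.
def dropout(x, p=0.5, training=True, rng=None):
    if not training or p == 0.0:
        return x                       # identity at inference time
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p    # Bernoulli keep-mask, P(keep) = 1 - p
    return x * mask / (1.0 - p)        # rescale so E[output] == x

x = np.ones((1000, 100))
y = dropout(x, p=0.5, rng=np.random.default_rng(0))
print(y.mean())  # close to 1.0: the expectation is preserved
```

With `p = 0.5` every surviving element is doubled, so each output entry is either `0.0` or `2.0`, and the mean over a large tensor stays near the input mean.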