analogvnn.nn.activation.SiLU

Module Contents

Classes

SiLU – Implements the SiLU activation function.
- class analogvnn.nn.activation.SiLU.SiLU[source]
Bases: analogvnn.nn.activation.Activation.Activation
Implements the SiLU (Sigmoid Linear Unit, also known as Swish) activation function, SiLU(x) = x * sigmoid(x).
- static forward(x: torch.Tensor) → torch.Tensor [source]
Forward pass of the SiLU.
- Parameters:
x (Tensor) – the input tensor.
- Returns:
the output tensor.
- Return type:
Tensor
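The forward pass computes SiLU(x) = x * sigmoid(x). A minimal scalar sketch of that formula (the library itself operates on torch tensors; plain `math` is used here only for illustration):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def silu(x: float) -> float:
    """SiLU (a.k.a. Swish): x * sigmoid(x)."""
    return x * sigmoid(x)

# sigmoid(0) = 0.5, so silu(0) = 0 * 0.5 = 0
print(silu(0.0))
```

For large positive inputs SiLU approaches the identity (sigmoid saturates at 1), and for large negative inputs it approaches 0, which is what makes it a smooth alternative to ReLU.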
- backward(grad_output: Optional[torch.Tensor]) → Optional[torch.Tensor] [source]
Backward pass of the SiLU.
- Parameters:
grad_output (Optional[Tensor]) – the gradient of the output tensor.
- Returns:
the gradient of the input tensor.
- Return type:
Optional[Tensor]
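By the chain rule, the backward pass scales grad_output by the derivative of SiLU, which works out to sigmoid(x) * (1 + x * (1 - sigmoid(x))). A scalar sketch of that factor, checked against a finite difference (illustrative only; the library's backward operates on torch tensors):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def silu(x: float) -> float:
    return x * sigmoid(x)

def silu_grad(x: float) -> float:
    """d/dx [x * sigmoid(x)] = sigmoid(x) * (1 + x * (1 - sigmoid(x)))."""
    s = sigmoid(x)
    return s * (1.0 + x * (1.0 - s))

# Verify the closed-form derivative against a central finite difference.
h, x = 1e-6, 0.7
numeric = (silu(x + h) - silu(x - h)) / (2 * h)
assert abs(silu_grad(x) - numeric) < 1e-6
```

At x = 0 the gradient is exactly 0.5, and the gradient of the input tensor is then grad_output multiplied elementwise by this factor.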