analogvnn.graph.BackwardGraph#
Module Contents#
Classes#
- BackwardGraph: The backward graph.
- class analogvnn.graph.BackwardGraph.BackwardGraph(graph_state: analogvnn.graph.ModelGraphState.ModelGraphState = None)[source]#
Bases:
analogvnn.graph.AcyclicDirectedGraph.AcyclicDirectedGraph
The backward graph.
- __call__(gradient: analogvnn.utils.common_types.TENSORS = None) analogvnn.graph.ArgsKwargs.ArgsKwargsOutput [source]#
Backward pass through the backward graph.
- Parameters:
gradient (TENSORS) – gradient of the loss function w.r.t. the output of the forward graph
- Returns:
The gradients of the loss w.r.t. the inputs of the forward graph.
- Return type:
ArgsKwargsOutput
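Calling the backward graph amounts to applying the chain rule along the reversed forward graph, starting from the loss gradient. The snippet below is a minimal conceptual sketch of that idea in plain Python; the node names and `local_grad` table are illustrative assumptions, not analogvnn internals.

```python
# Conceptual sketch of a backward pass: walk the reversed forward graph
# and apply the chain rule at each node. Illustrative only.

# Forward graph: x -> double -> add_three -> output,
# where double(x) = 2 * x and add_three(y) = y + 3.
# Local derivative of each node w.r.t. its own input:
local_grad = {
    'double': lambda x: 2.0,     # d(2x)/dx
    'add_three': lambda y: 1.0,  # d(y+3)/dy
}

# Reverse topological order of the forward graph:
backward_order = ['add_three', 'double']

def backward(grad_output, saved_inputs):
    """Propagate the loss gradient from the output back to the input."""
    grad = grad_output
    for node in backward_order:
        grad = grad * local_grad[node](saved_inputs[node])
    return grad

# Forward pass with x = 5 saved: double saw 5, add_three saw 10.
saved = {'double': 5.0, 'add_three': 10.0}
print(backward(1.0, saved))  # chain rule: 1 * 1 * 2 = 2.0
```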
- compile(is_static=True)[source]#
Compile the graph.
- Parameters:
is_static (bool) – If True, the graph is not changing during runtime and will be cached.
- Returns:
self.
- Return type:
BackwardGraph
- Raises:
ValueError – If no forward pass has been performed yet.
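The `is_static` flag enables caching: when the topology cannot change at runtime, the traversal order can be computed once and reused. A minimal sketch of that caching pattern with the standard library's `graphlib` (the class and names here are hypothetical, not analogvnn's code):

```python
from graphlib import TopologicalSorter

class CachedGraph:
    """Illustrative static-graph cache, mirroring the is_static idea."""

    def __init__(self, predecessors, is_static=True):
        self.predecessors = predecessors  # {node: set of predecessor nodes}
        self.is_static = is_static
        self._cached_order = None

    def order(self):
        # Static graphs reuse the order computed on the first call.
        if self.is_static and self._cached_order is not None:
            return self._cached_order
        order = list(TopologicalSorter(self.predecessors).static_order())
        if self.is_static:
            self._cached_order = order
        return order

g = CachedGraph({'loss': {'layer'}, 'layer': {'input'}})
print(g.order())  # ['input', 'layer', 'loss']
```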
- from_forward(forward_graph: Union[analogvnn.graph.AcyclicDirectedGraph.AcyclicDirectedGraph, networkx.DiGraph]) BackwardGraph [source]#
Create the backward graph by inverting the forward graph.
- Parameters:
forward_graph (Union[AcyclicDirectedGraph, nx.DiGraph]) – The forward graph.
- Returns:
self.
- Return type:
BackwardGraph
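Inverting the forward graph means flipping the direction of every edge, so gradients flow from outputs back to inputs. A minimal sketch with plain tuples (analogvnn instead accepts an `AcyclicDirectedGraph` or a `networkx.DiGraph`; the node names here are made up):

```python
def invert(forward_edges):
    """Return the edge list with every (u, v) flipped to (v, u)."""
    return [(v, u) for (u, v) in forward_edges]

# A toy forward graph: input -> layer1 -> layer2 -> output
forward = [('input', 'layer1'), ('layer1', 'layer2'), ('layer2', 'output')]
print(invert(forward))
# [('layer1', 'input'), ('layer2', 'layer1'), ('output', 'layer2')]
```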
- calculate(*args, **kwargs) analogvnn.graph.ArgsKwargs.ArgsKwargsOutput [source]#
Calculate the gradient of the whole graph w.r.t. loss.
- Parameters:
*args – The positional gradient arguments of the outputs.
**kwargs – The keyword gradient arguments of the outputs.
- Returns:
The gradients of the loss w.r.t. the inputs of the forward graph.
- Return type:
ArgsKwargsOutput
- Raises:
ValueError – If no forward pass has been performed yet.
- _pass(from_node: analogvnn.graph.GraphEnum.GRAPH_NODE_TYPE, *args, **kwargs) Dict[analogvnn.graph.GraphEnum.GRAPH_NODE_TYPE, analogvnn.graph.ArgsKwargs.InputOutput] [source]#
Perform the backward pass through the graph.
- Parameters:
from_node (GRAPH_NODE_TYPE) – The node to start the backward pass from.
*args – The positional gradient arguments of the outputs.
**kwargs – The keyword gradient arguments of the outputs.
- Returns:
The input and output gradients of each node.
- Return type:
Dict[GRAPH_NODE_TYPE, InputOutput]
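A pass like this visits each node in the backward direction and records both the gradient flowing in (w.r.t. the node's output) and the gradient flowing out (w.r.t. its input). The sketch below captures that per-node bookkeeping with plain dicts; it is similar in spirit to the `InputOutput` records but is not analogvnn's implementation.

```python
def backward_pass(from_node, grad, local_grads, backward_edges):
    """Traverse backward edges, recording each node's in/out gradients."""
    records = {}
    stack = [(from_node, grad)]
    while stack:
        node, g_out = stack.pop()
        g_in = g_out * local_grads[node]  # chain rule at this node
        records[node] = {'output_grad': g_out, 'input_grad': g_in}
        for nxt in backward_edges.get(node, []):
            stack.append((nxt, g_in))
    return records

# Forward direction: input --(x3)--> layer --(x2)--> output
recs = backward_pass(
    'layer', 1.0,
    local_grads={'layer': 2.0, 'input': 3.0},
    backward_edges={'layer': ['input'], 'input': []},
)
print(recs['input'])  # {'output_grad': 2.0, 'input_grad': 6.0}
```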
- _calculate_gradients(module: Union[analogvnn.graph.AccumulateGrad.AccumulateGrad, analogvnn.nn.module.Layer.Layer, analogvnn.backward.BackwardModule.BackwardModule, Callable], grad_outputs: analogvnn.graph.ArgsKwargs.InputOutput) analogvnn.graph.ArgsKwargs.ArgsKwargs [source]#
Calculate the input gradients of a module using the gradients of its outputs.
- Parameters:
module (Union[AccumulateGrad, Layer, BackwardModule, Callable]) – The module to calculate the gradient of.
grad_outputs (InputOutput) – The gradients of the output of the module.
- Returns:
The input gradients of the module.
- Return type:
ArgsKwargs