Automatic Differentiation
backward
```python
def backward(outputs: Any, cotangents: Any = None, *, create_graph: bool = False) -> None
```
PyTorch-style backward pass that populates `.grad` on tensors with `requires_grad=True`.
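A minimal usage sketch. The `Tensor` import path and constructor are assumptions for illustration; only `backward`, `.grad`, and `requires_grad` come from this page, and the seed-with-ones behavior for `cotangents=None` is assumed by analogy with PyTorch:

```python
from somelib import Tensor, backward  # hypothetical import path

x = Tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x * x).sum()   # scalar output of the traced computation

# With cotangents=None, a scalar output is assumed to be seeded with a
# cotangent of 1.0, matching PyTorch's loss.backward() convention.
backward(y)

print(x.grad)       # dy/dx = 2*x -> [2.0, 4.0, 6.0]
```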
This function (see the sketch after this list):

1. Builds a `Trace` from `outputs` using `compute_for_backward()`.
2. Traverses all `OpNode`s back to the true leaves of the graph.
3. Collects every tensor with `requires_grad=True` as a gradient leaf.
4. Runs the VJP (vector-Jacobian product) over the trace.
5. Populates the `.grad` attribute on each collected gradient leaf.
6. Batch-realizes all gradients for efficiency.
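The steps above might compose roughly as follows. This is a minimal sketch under stated assumptions, not the actual implementation: `trace.outputs`, `node.inputs`, and the helpers `vjp_on_trace` and `realize_all` are hypothetical stand-ins for whatever this library provides; only `compute_for_backward()`, `OpNode`, `Trace`, `.grad`, and `requires_grad` come from the description above.

```python
def backward_sketch(outputs, cotangents=None, *, create_graph=False):
    # 1. Build a Trace from the outputs (compute_for_backward is named in
    #    the docs; passing outputs directly is an assumed signature).
    trace = compute_for_backward(outputs)

    # 2.-3. Walk from the trace's outputs through all OpNodes back to true
    # leaves, collecting tensors with requires_grad=True as gradient leaves.
    # (trace.outputs and node.inputs are assumed attribute names.)
    leaves, seen = [], set()
    stack = list(trace.outputs)
    while stack:
        node = stack.pop()
        if id(node) in seen:
            continue
        seen.add(id(node))
        if isinstance(node, OpNode):
            stack.extend(node.inputs)
        elif getattr(node, "requires_grad", False):
            leaves.append(node)

    # 4. Run the VJP over the trace (hypothetical helper returning one
    #    gradient per collected leaf, in the same order).
    grads = vjp_on_trace(trace, cotangents, leaves, create_graph=create_graph)

    # 5. Populate .grad on each leaf, accumulating PyTorch-style
    #    (assumes .grad defaults to None before the first backward call).
    for leaf, grad in zip(leaves, grads):
        leaf.grad = grad if leaf.grad is None else leaf.grad + grad

    # 6. Realize all gradients in one batch rather than one at a time
    #    (realize_all is a hypothetical batching helper).
    realize_all([leaf.grad for leaf in leaves])
```

Batching the final realization (step 6) lets the backend fuse or schedule all gradient computations together instead of materializing each `.grad` tensor individually.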