Regularization
Regularization techniques for neural networks.
- nabla.nn.utils.regularization.l1_regularization(params, weight=0.01)
Compute L1 (Lasso) regularization loss.
L1 regularization adds a penalty equal to the sum of absolute values of parameters. This encourages sparsity in the model parameters.
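The penalty can be sketched in plain NumPy (an illustrative re-implementation, not nabla's own code; nabla arrays would take the place of the NumPy arrays here):

```python
import numpy as np

def l1_regularization(params, weight=0.01):
    # Sum of absolute parameter values across all arrays, scaled by `weight`.
    return weight * sum(np.abs(p).sum() for p in params)

params = [np.array([[1.0, -2.0], [0.0, 3.0]]), np.array([-4.0])]
loss = l1_regularization(params)  # 0.01 * (1 + 2 + 0 + 3 + 4) = 0.1
```

The penalty is typically added to the task loss before backpropagation; because the gradient of |w| pushes each weight toward zero by a constant amount, small weights are driven exactly to zero, which is what produces sparsity.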
- nabla.nn.utils.regularization.l2_regularization(params, weight=0.01)
Compute L2 (Ridge) regularization loss.
L2 regularization adds a penalty equal to the sum of squares of parameters. This encourages small parameter values and helps prevent overfitting.
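A minimal NumPy sketch of the same idea (again an illustrative stand-in, not the library function):

```python
import numpy as np

def l2_regularization(params, weight=0.01):
    # Sum of squared parameter values across all arrays, scaled by `weight`.
    return weight * sum((p ** 2).sum() for p in params)

params = [np.array([3.0, 4.0])]
loss = l2_regularization(params)  # 0.01 * (9 + 16) = 0.25
```

Unlike L1, the gradient of w² shrinks each weight in proportion to its magnitude, so L2 favors many small weights rather than exact zeros.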
- nabla.nn.utils.regularization.elastic_net_regularization(params, l1_weight=0.01, l2_weight=0.01, l1_ratio=0.5)
Compute Elastic Net regularization loss.
Elastic Net combines L1 and L2 regularization: ElasticNet = l1_ratio * L1 + (1 - l1_ratio) * L2
- Parameters:
  - params – model parameters to regularize
  - l1_weight (float, default 0.01) – weight on the L1 term
  - l2_weight (float, default 0.01) – weight on the L2 term
  - l1_ratio (float, default 0.5) – mixing ratio between the L1 and L2 terms
- Returns:
Scalar Elastic Net regularization loss
- Return type:
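One plausible reading of the formula above, sketched in NumPy (an assumption: each term carries its own weight before the `l1_ratio` mix; this is an illustration, not nabla's implementation):

```python
import numpy as np

def elastic_net_regularization(params, l1_weight=0.01, l2_weight=0.01,
                               l1_ratio=0.5):
    # Assumed interpretation: weight each term, then mix by l1_ratio.
    l1 = l1_weight * sum(np.abs(p).sum() for p in params)
    l2 = l2_weight * sum((p ** 2).sum() for p in params)
    return l1_ratio * l1 + (1 - l1_ratio) * l2

params = [np.array([3.0, -4.0])]
# L1 term = 0.01 * 7 = 0.07, L2 term = 0.01 * 25 = 0.25
loss = elastic_net_regularization(params)  # 0.5 * 0.07 + 0.5 * 0.25 = 0.16
```

Setting `l1_ratio=1.0` recovers a pure L1 penalty and `l1_ratio=0.0` a pure L2 penalty, which is a quick way to sanity-check the mixing behavior.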
- nabla.nn.utils.regularization.dropout(x, p=0.5, training=True, seed=None)
Apply dropout regularization.
During training, each element is independently set to zero with probability p. During inference, all elements are scaled by (1 - p) so that expected activations match those seen during training.
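The described behavior can be sketched as follows (an illustrative NumPy stand-in for the nabla function; note this is the classic, non-inverted dropout variant the docstring describes, which scales at inference rather than during training):

```python
import numpy as np

def dropout(x, p=0.5, training=True, seed=None):
    # Sketch of the documented semantics, not nabla's implementation.
    if not training:
        return x * (1.0 - p)  # inference: scale to preserve expected values
    rng = np.random.default_rng(seed)
    mask = rng.random(x.shape) >= p  # keep each element with probability 1 - p
    return x * mask

out_train = dropout(np.ones(8), p=0.5, seed=0)          # entries kept or zeroed
out_eval = dropout(np.ones(8), p=0.5, training=False)   # all entries scaled to 0.5
```

Passing a `seed` makes the training-time mask reproducible, which is useful in tests.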
- nabla.nn.utils.regularization.spectral_normalization(weight, u=None, n_iterations=1)
Apply spectral normalization to weight matrix.
Spectral normalization constrains the spectral norm (largest singular value) of weight matrices to be at most 1. This stabilizes training of GANs.
- Parameters:
  - weight – weight matrix to normalize
  - u – running estimate of the leading left singular vector; initialized internally if None
  - n_iterations (int, default 1) – number of power-iteration steps per call
- Returns:
Tuple of (normalized_weight, updated_u)
- Return type:
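The standard way to implement this is power iteration, as in the SN-GAN recipe; a NumPy sketch under that assumption (not nabla's actual code) looks like:

```python
import numpy as np

def spectral_normalization(weight, u=None, n_iterations=1):
    # Power iteration estimating the largest singular value (a sketch
    # following the SN-GAN recipe; assumed, not nabla's implementation).
    if u is None:
        rng = np.random.default_rng(0)
        u = rng.standard_normal(weight.shape[0])
        u /= np.linalg.norm(u)
    for _ in range(n_iterations):
        v = weight.T @ u
        v /= np.linalg.norm(v)
        u = weight @ v
        u /= np.linalg.norm(u)
    sigma = u @ weight @ v          # estimated largest singular value
    return weight / sigma, u        # normalized matrix has spectral norm ~1

W = np.diag([3.0, 1.0])
W_sn, u = spectral_normalization(W, n_iterations=50)
```

Returning the updated `u` lets the caller carry the estimate across training steps, which is why a single power iteration per call (the default) usually suffices in practice.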