Activations
Activation functions for neural networks.
nabla.nn.layers.activations.relu(x)[source]
Rectified Linear Unit activation function.
- Parameters:
x (Array) – Input array
- Returns:
Array with ReLU applied element-wise
- Return type:
Array
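A minimal usage sketch (the same element-wise call pattern applies to sigmoid and tanh below). The `nabla.array` constructor used here is an assumption; substitute the library's actual array-creation routine:

```python
import nabla
from nabla.nn.layers.activations import relu

# Assumption: nabla.array builds an Array from a Python list.
x = nabla.array([-2.0, -0.5, 0.0, 1.5])
y = relu(x)  # negatives clamp to zero -> [0.0, 0.0, 0.0, 1.5]
```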
nabla.nn.layers.activations.leaky_relu(x, negative_slope=0.01)[source]
Leaky ReLU activation function.
- Parameters:
  - x (Array) – Input array
  - negative_slope (float) – Slope applied to negative inputs (default: 0.01)
- Returns:
Array with Leaky ReLU applied element-wise
- Return type:
Array
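A short sketch of the negative_slope parameter, again assuming `nabla.array` as the array constructor:

```python
import nabla
from nabla.nn.layers.activations import leaky_relu

x = nabla.array([-2.0, 3.0])  # assumption: nabla.array constructor
# Negative inputs are scaled by negative_slope rather than zeroed:
# -2.0 * 0.1 == -0.2, while positive inputs pass through unchanged.
y = leaky_relu(x, negative_slope=0.1)
```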
nabla.nn.layers.activations.sigmoid(x)[source]
Sigmoid activation function.
- Parameters:
x (Array) – Input array
- Returns:
Array with sigmoid applied element-wise
- Return type:
Array
nabla.nn.layers.activations.tanh(x)[source]
Hyperbolic tangent activation function.
- Parameters:
x (Array) – Input array
- Returns:
Array with tanh applied element-wise
- Return type:
Array
nabla.nn.layers.activations.gelu(x)[source]
Gaussian Error Linear Unit activation function.
GELU(x) = x * Φ(x), where Φ(x) is the CDF of the standard normal distribution.
Tanh approximation: GELU(x) ≈ 0.5 * x * (1 + tanh(√(2/π) * (x + 0.044715 * x^3)))
- Parameters:
x (Array) – Input array
- Returns:
Array with GELU applied element-wise
- Return type:
Array
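The tanh approximation above can be checked with plain Python; this standalone sketch mirrors the formula rather than calling the library:

```python
import math

def gelu_tanh(x: float) -> float:
    # Tanh approximation from the formula above.
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

print(gelu_tanh(1.0))  # ~0.8412; the exact value x * Phi(x) is ~0.8413
```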
nabla.nn.layers.activations.swish(x, beta=1.0)[source]
Swish (SiLU) activation function.
Swish(x) = x * sigmoid(β * x)
When β = 1, this is SiLU (Sigmoid Linear Unit).
- Parameters:
  - x (Array) – Input array
  - beta (float) – Scaling factor inside the sigmoid (default: 1.0)
- Returns:
Array with Swish applied element-wise
- Return type:
Array
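A plain-Python sketch of the formula, illustrating how β shapes the sigmoid gate (this mirrors the definition above, not the library's internals):

```python
import math

def swish_scalar(x: float, beta: float = 1.0) -> float:
    # Swish(x) = x * sigmoid(beta * x), per the formula above.
    return x / (1.0 + math.exp(-beta * x))

print(swish_scalar(2.0))            # ~1.7616 (this is SiLU, beta = 1)
print(swish_scalar(2.0, beta=2.0))  # ~1.9640; larger beta sharpens the gate
```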
nabla.nn.layers.activations.silu(x)[source]
Sigmoid Linear Unit (SiLU) activation function.
SiLU(x) = x * sigmoid(x) = Swish(x, β=1)
- Parameters:
x (Array) – Input array
- Returns:
Array with SiLU applied element-wise
- Return type:
Array
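Given the identity above, silu and swish with β = 1 should agree; a hedged sketch, again assuming `nabla.array` as the constructor:

```python
import nabla
from nabla.nn.layers.activations import silu, swish

x = nabla.array([-1.0, 0.0, 2.0])  # assumption: nabla.array constructor
# By the identity SiLU(x) = Swish(x, beta=1), these should match element-wise.
a = silu(x)
b = swish(x, beta=1.0)
```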
nabla.nn.layers.activations.softmax(x, axis=-1)[source]
Softmax activation function.
- Parameters:
  - x (Array) – Input array
  - axis (int) – Axis along which to apply softmax (default: -1)
- Returns:
Array with softmax applied along specified axis
- Return type:
Array
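A plain-Python sketch of softmax over one axis, using the standard max-subtraction trick for numerical stability (the trick is common practice, not something this page documents):

```python
import math

def softmax_1d(xs: list[float]) -> list[float]:
    m = max(xs)  # subtract the max so exp() cannot overflow
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax_1d([1.0, 2.0, 3.0])
print(probs)       # ~[0.0900, 0.2447, 0.6652]
print(sum(probs))  # 1.0: values along the reduced axis sum to one
```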
nabla.nn.layers.activations.log_softmax(x, axis=-1)[source]
Log-softmax activation function.
- Parameters:
  - x (Array) – Input array
  - axis (int) – Axis along which to apply log-softmax (default: -1)
- Returns:
Array with log-softmax applied along specified axis
- Return type:
Array
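Likewise, a plain-Python sketch of the stable identity log_softmax(x) = x - max(x) - log(Σ exp(x - max(x))); this restates the math, not the library's internals:

```python
import math

def log_softmax_1d(xs: list[float]) -> list[float]:
    m = max(xs)
    lse = math.log(sum(math.exp(v - m) for v in xs))
    return [v - m - lse for v in xs]

print(log_softmax_1d([1.0, 2.0, 3.0]))  # ~[-2.4076, -1.4076, -0.4076]
```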
nabla.nn.layers.activations.get_activation(name)[source]
Get activation function by name.
- Parameters:
name (str) – Name of the activation function
- Returns:
Activation function
- Raises:
ValueError – If the named activation function is not found
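A lookup sketch; "relu" is a reasonable guess for an accepted name given the functions on this page, and `nabla.array` is again an assumed constructor:

```python
import nabla
from nabla.nn.layers.activations import get_activation

act = get_activation("relu")  # look the function up by its string name
x = nabla.array([-1.0, 2.0])  # assumption: nabla.array constructor
y = act(x)                    # same result as calling relu(x) directly

try:
    get_activation("no_such_activation")
except ValueError as e:
    print(e)  # unknown names raise ValueError, per the docs above
```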