Activation Modules#

All modules in this section are registered as pytree nodes, enabling direct use with transforms.

ReLU#

class ReLU() -> 'None':

Apply the rectified linear unit, max(x, 0), elementwise.
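
For reference, a minimal sketch of the computation this module performs, written with jax.numpy (jax.nn.relu is the equivalent functional form; the module wrapper itself is assumed):

    import jax.numpy as jnp

    def relu(x):
        # Zero out negative entries: max(x, 0) elementwise.
        return jnp.maximum(x, 0.0)

    relu(jnp.array([-2.0, 0.0, 1.5]))  # -> [0. 0. 1.5]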


GELU#

class GELU() -> 'None':

Apply the Gaussian error linear unit, x * Φ(x) where Φ is the standard normal CDF, elementwise.
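
A sketch of the exact GELU using jax.scipy's normal CDF (jax.nn.gelu offers the same computation plus the common tanh approximation via its approximate flag; the module wrapper itself is assumed):

    import jax.numpy as jnp
    from jax.scipy.stats import norm

    def gelu(x):
        # Exact GELU: scale each entry by the probability that a
        # standard normal variable falls below it.
        return x * norm.cdf(x)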


Sigmoid#

class Sigmoid() -> 'None':

Apply the logistic sigmoid, 1 / (1 + exp(-x)), elementwise, mapping inputs to (0, 1).
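
A direct sketch of the formula (jax.nn.sigmoid provides a numerically robust implementation of the same map; the module wrapper itself is assumed):

    import jax.numpy as jnp

    def sigmoid(x):
        # Logistic function: squashes reals into the open interval (0, 1).
        return 1.0 / (1.0 + jnp.exp(-x))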


Tanh#

class Tanh() -> 'None':

Apply the hyperbolic tangent elementwise, mapping inputs to (-1, 1).
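
A sketch of the computation; tanh is a rescaled sigmoid, tanh(x) = 2 * sigmoid(2x) - 1, and jnp.tanh evaluates it directly (the module wrapper itself is assumed):

    import jax.numpy as jnp

    def tanh(x):
        # Hyperbolic tangent: zero-centered squashing into (-1, 1).
        return jnp.tanh(x)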


SiLU#

class SiLU() -> 'None':

Apply the sigmoid linear unit (swish), x * sigmoid(x), elementwise.
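
A sketch of SiLU in terms of the sigmoid above (jax.nn.silu, also exposed as jax.nn.swish, is the equivalent functional form; the module wrapper itself is assumed):

    import jax
    import jax.numpy as jnp

    def silu(x):
        # Each entry is scaled by its own sigmoid; smooth, and
        # non-monotone for slightly negative inputs.
        return x * jax.nn.sigmoid(x)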


Softmax#

class Softmax(axis: 'int' = -1) -> 'None':

Apply softmax along a given axis (default: last).
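
A sketch of the standard numerically stable computation (jax.nn.softmax is the equivalent functional form; subtracting the per-axis maximum leaves the result unchanged but prevents overflow in exp):

    import jax.numpy as jnp

    def softmax(x, axis=-1):
        # Shift by the max along the axis for numerical stability,
        # then normalize the exponentials so they sum to 1.
        z = x - jnp.max(x, axis=axis, keepdims=True)
        e = jnp.exp(z)
        return e / jnp.sum(e, axis=axis, keepdims=True)

    softmax(jnp.array([1.0, 2.0, 3.0]))  # entries sum to 1 along the last axis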