Activation

class colibri.models.custom_layers.Activation(activation='relu')[source]

Bases: Module

Activation layer that wraps the activation function selected by name (default: 'relu').
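
A minimal usage sketch, assuming only the default name 'relu' shown in the signature; since Activation is an nn.Module, it composes with other PyTorch layers:

import torch.nn as nn
from colibri.models.custom_layers import Activation

# Only the default activation name 'relu' from the signature is assumed here.
model = nn.Sequential(
    nn.Linear(8, 16),
    Activation(activation='relu'),  # wraps the named activation function
    nn.Linear(16, 1),
)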

get_activation(name)[source]

Get the activation function corresponding to name.

Parameters:

name (str) – Name of the activation function.

Returns:

The activation function module corresponding to name.

Return type:

nn.Module
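
The set of supported names is not listed here; the following is a hypothetical sketch of the name-to-module lookup pattern such a method typically implements, not colibri's actual mapping:

import torch.nn as nn

# Hypothetical registry -- the real names accepted by get_activation may differ.
_ACTIVATIONS = {
    'relu': nn.ReLU,
    'sigmoid': nn.Sigmoid,
    'tanh': nn.Tanh,
}

def get_activation(name: str) -> nn.Module:
    """Return an instance of the activation module registered under name."""
    try:
        return _ACTIVATIONS[name]()
    except KeyError:
        raise ValueError(f"Unknown activation: {name!r}")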

forward(x)[source]

Applies the activation function to the input tensor.

Parameters:

x (torch.Tensor) – Input tensor.

Returns:

Output tensor with the activation applied.

Return type:

torch.Tensor
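
A short sketch of forward in use, assuming the wrapped activation is the default 'relu' (so negative entries are zeroed):

import torch
from colibri.models.custom_layers import Activation

layer = Activation(activation='relu')
x = torch.tensor([[-1.0, 0.0, 2.0]])
y = layer(x)  # nn.Module.__call__ dispatches to forward(x)
# With ReLU assumed, y == tensor([[0., 0., 2.]])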