cvnets.layers.activation package

Submodules

cvnets.layers.activation.gelu module

class cvnets.layers.activation.gelu.GELU(*args, **kwargs)[source]

Bases: torch.nn.GELU

Applies the Gaussian Error Linear Units (GELU) function.

__init__(*args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.
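
A minimal usage sketch (assuming torch and cvnets are importable; the input shape is arbitrary, since GELU is applied element-wise):

    import torch
    from cvnets.layers.activation.gelu import GELU

    act = GELU()           # *args/**kwargs are accepted per the signature above
    x = torch.randn(2, 8)
    y = act(x)             # call the module instance, not act.forward(x)
    print(y.shape)         # torch.Size([2, 8])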

cvnets.layers.activation.hard_sigmoid module

class cvnets.layers.activation.hard_sigmoid.Hardsigmoid(inplace: bool | None = False, *args, **kwargs)[source]

Bases: torch.nn.Hardsigmoid

Applies the Hard Sigmoid function, a piecewise-linear approximation of the sigmoid.

__init__(inplace: bool | None = False, *args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.

forward(input: Tensor, *args, **kwargs) Tensor[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself rather than forward() directly, since the former runs the registered hooks while the latter silently ignores them.
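
The note above has a practical consequence: forward hooks fire only when the module instance itself is called. A short sketch (the hook body is illustrative):

    import torch
    from cvnets.layers.activation.hard_sigmoid import Hardsigmoid

    act = Hardsigmoid(inplace=False)
    act.register_forward_hook(lambda module, inputs, output: print("hook fired"))

    x = torch.randn(4)
    _ = act(x)           # runs registered hooks: prints "hook fired"
    _ = act.forward(x)   # computes the same result but silently skips the hook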

cvnets.layers.activation.hard_swish module

class cvnets.layers.activation.hard_swish.Hardswish(inplace: bool | None = False, *args, **kwargs)[source]

Bases: torch.nn.Hardswish

Applies the HardSwish function, as described in the paper Searching for MobileNetV3.

__init__(inplace: bool | None = False, *args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.

forward(input: Tensor, *args, **kwargs) Tensor[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself rather than forward() directly, since the former runs the registered hooks while the latter silently ignores them.
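
For reference, hard swish replaces the sigmoid in swish with a piecewise-linear approximation. A sketch of the element-wise definition from the MobileNetV3 paper, checked against PyTorch's built-in (hard_swish_reference is an illustrative helper, not part of cvnets):

    import torch
    import torch.nn.functional as F

    def hard_swish_reference(x: torch.Tensor) -> torch.Tensor:
        # hard_swish(x) = x * ReLU6(x + 3) / 6
        return x * torch.clamp(x + 3.0, min=0.0, max=6.0) / 6.0

    x = torch.randn(5)
    assert torch.allclose(hard_swish_reference(x), F.hardswish(x))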

cvnets.layers.activation.leaky_relu module

class cvnets.layers.activation.leaky_relu.LeakyReLU(negative_slope: float | None = 0.01, inplace: bool | None = False, *args, **kwargs)[source]

Bases: torch.nn.LeakyReLU

Applies the Leaky ReLU function. See Rectifier Nonlinearities Improve Neural Network Acoustic Models for more details.

__init__(negative_slope: float | None = 0.01, inplace: bool | None = False, *args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.
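
A brief sketch of the effect of negative_slope (assuming torch and cvnets are importable):

    import torch
    from cvnets.layers.activation.leaky_relu import LeakyReLU

    act = LeakyReLU(negative_slope=0.01)
    x = torch.tensor([-2.0, 0.0, 3.0])
    print(act(x))  # tensor([-0.0200, 0.0000, 3.0000]); negative inputs are scaled by the slope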

cvnets.layers.activation.prelu module

class cvnets.layers.activation.prelu.PReLU(num_parameters: int | None = 1, init: float | None = 0.25, *args, **kwargs)[source]

Bases: torch.nn.PReLU

Applies the Parametric Rectified Linear Unit (PReLU) function.

__init__(num_parameters: int | None = 1, init: float | None = 0.25, *args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.
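
Unlike leaky ReLU, the negative slope here is learnable, and num_parameters controls whether a single slope is shared or one is learned per channel. A brief sketch (the shapes are illustrative):

    import torch
    from cvnets.layers.activation.prelu import PReLU

    per_channel = PReLU(num_parameters=16, init=0.25)  # one learnable slope per channel
    x = torch.randn(2, 16, 8, 8)                       # num_parameters must match dim 1
    print(per_channel(x).shape)                        # torch.Size([2, 16, 8, 8])
    print(sum(p.numel() for p in per_channel.parameters()))  # 16 learnable slopes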

cvnets.layers.activation.relu module

class cvnets.layers.activation.relu.ReLU(inplace: bool | None = False, *args, **kwargs)[source]

Bases: torch.nn.ReLU

Applies the Rectified Linear Unit (ReLU) function.

__init__(inplace: bool | None = False, *args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.

cvnets.layers.activation.relu6 module

class cvnets.layers.activation.relu6.ReLU6(inplace: bool | None = False, *args, **kwargs)[source]

Bases: torch.nn.ReLU6

Applies the ReLU6 function, i.e., ReLU with outputs clamped at 6: min(max(0, x), 6).

__init__(inplace: bool | None = False, *args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.

cvnets.layers.activation.sigmoid module

class cvnets.layers.activation.sigmoid.Sigmoid(*args, **kwargs)[source]

Bases: torch.nn.Sigmoid

Applies the Sigmoid function.

__init__(*args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.

cvnets.layers.activation.swish module

class cvnets.layers.activation.swish.Swish(inplace: bool | None = False, *args, **kwargs)[source]

Bases: torch.nn.SiLU

Applies the Swish (also known as SiLU) function.

__init__(inplace: bool | None = False, *args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.
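
Since this class derives from SiLU, it computes the element-wise map x * sigmoid(x). A quick sketch verifying the equivalence:

    import torch
    from cvnets.layers.activation.swish import Swish

    x = torch.randn(5)
    assert torch.allclose(Swish()(x), x * torch.sigmoid(x))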

cvnets.layers.activation.tanh module

class cvnets.layers.activation.tanh.Tanh(*args, **kwargs)[source]

Bases: torch.nn.Tanh

Applies the Tanh function.

__init__(*args, **kwargs) None[source]

Initializes internal Module state, shared by both nn.Module and ScriptModule.

Module contents

cvnets.layers.activation.register_act_fn(name)[source]
cvnets.layers.activation.arguments_activation_fn(parser: ArgumentParser)[source]
cvnets.layers.activation.build_activation_layer(opts: Namespace, act_type: str | None = None, inplace: bool | None = None, negative_slope: float | None = None, num_parameters: int = -1) Module[source]

Helper function to build an activation layer. If an optional argument is not provided (i.e., is None), the corresponding model.activation.* config entry is used as its default value.

Parameters:
  • act_type – Name of the activation layer. Default: --model.activation.name config value.

  • inplace – If True, the operation is performed in-place. Default: --model.activation.inplace config value.

  • negative_slope – Negative slope parameter for leaky_relu. Default: --model.activation.neg-slope config value.

  • num_parameters – Number of learnable parameters for prelu (see PReLU above).
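
To tie the pieces together, a hedged end-to-end sketch. It assumes arguments_activation_fn extends the parser it receives (argparse parsers are mutated in place) and that "leaky_relu" is a registered activation name, as the parameter docs above suggest:

    import argparse

    from cvnets.layers.activation import (
        arguments_activation_fn,
        build_activation_layer,
    )

    parser = argparse.ArgumentParser()
    arguments_activation_fn(parser)  # registers the --model.activation.* options
    opts = parser.parse_args([])     # fall back to the registered defaults

    # With no explicit arguments, the model.activation.* config entries apply.
    default_act = build_activation_layer(opts)

    # Explicit arguments override the corresponding config entries.
    leaky = build_activation_layer(opts, act_type="leaky_relu", negative_slope=0.2)

Judging by its signature, register_act_fn(name) is a decorator factory that populates the registry build_activation_layer draws from, so a custom activation class can be registered under a new name and then selected via act_type.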