cvnets.layers.activation package
Submodules
cvnets.layers.activation.gelu module
- class cvnets.layers.activation.gelu.GELU(*args, **kwargs)[source]
Bases: torch.nn.GELU
Applies the Gaussian Error Linear Units function
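A minimal usage sketch (an assumption based on the standard PyTorch module-call convention; the input shape is illustrative):

```python
import torch
from cvnets.layers.activation.gelu import GELU

# GELU here derives from torch.nn.GELU, so it is used like any other
# activation module: construct it once, then call it on a tensor.
act = GELU()
x = torch.randn(2, 8)   # illustrative input
y = act(x)              # elementwise GELU, same shape as x
```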
cvnets.layers.activation.hard_sigmoid module
- class cvnets.layers.activation.hard_sigmoid.Hardsigmoid(inplace: bool | None = False, *args, **kwargs)[source]
Bases: torch.nn.Hardsigmoid
Applies the Hard Sigmoid function
- __init__(inplace: bool | None = False, *args, **kwargs) None [source]
Initializes internal Module state, shared by both nn.Module and ScriptModule.
- forward(input: Tensor, *args, **kwargs) Tensor [source]
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
cvnets.layers.activation.hard_swish module
- class cvnets.layers.activation.hard_swish.Hardswish(inplace: bool | None = False, *args, **kwargs)[source]
Bases: torch.nn.Hardswish
Applies the HardSwish function, as described in the paper Searching for MobileNetV3
- __init__(inplace: bool | None = False, *args, **kwargs) None [source]
Initializes internal Module state, shared by both nn.Module and ScriptModule.
- forward(input: Tensor, *args, **kwargs) Tensor [source]
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
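A short sketch of the inplace flag (assuming it is forwarded to the underlying torch.nn.Hardswish):

```python
import torch
from cvnets.layers.activation.hard_swish import Hardswish

# With inplace=True the activation overwrites the input tensor's storage
# instead of allocating a new output tensor.
act = Hardswish(inplace=True)
x = torch.randn(4, 16)  # illustrative input
y = act(x)              # y shares storage with x when inplace=True
```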
cvnets.layers.activation.leaky_relu module
cvnets.layers.activation.prelu module
- class cvnets.layers.activation.prelu.PReLU(num_parameters: int | None = 1, init: float | None = 0.25, *args, **kwargs)[source]
Bases: torch.nn.PReLU
Applies the Parametric Rectified Linear Unit function
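A sketch of the per-channel slope parameters (assuming num_parameters and init are forwarded to torch.nn.PReLU):

```python
import torch
from cvnets.layers.activation.prelu import PReLU

# num_parameters=16 gives one learnable negative slope per channel of a
# 16-channel input; init sets the starting value of those slopes.
act = PReLU(num_parameters=16, init=0.25)
x = torch.randn(8, 16)  # (batch, channels), illustrative
y = act(x)
```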
cvnets.layers.activation.relu module
cvnets.layers.activation.relu6 module
cvnets.layers.activation.sigmoid module
cvnets.layers.activation.swish module
- class cvnets.layers.activation.swish.Swish(inplace: bool | None = False, *args, **kwargs)[source]
Bases: torch.nn.SiLU
Applies the Swish (also known as SiLU) function.
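Since the layer is a thin wrapper over torch.nn.SiLU, a quick sketch of the identity it computes, x * sigmoid(x):

```python
import torch
from cvnets.layers.activation.swish import Swish

act = Swish()
x = torch.randn(5)
# Swish/SiLU applies x * sigmoid(x) elementwise.
assert torch.allclose(act(x), x * torch.sigmoid(x))
```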
cvnets.layers.activation.tanh module
Module contents
- cvnets.layers.activation.build_activation_layer(opts: Namespace, act_type: str | None = None, inplace: bool | None = None, negative_slope: float | None = None, num_parameters: int = -1) Module [source]
Helper function to build the activation function. If any of the optional arguments are not provided (i.e., None), the corresponding model.activation.* config entry is used as the default value.
- Parameters:
act_type – Name of the activation layer. Default: --model.activation.name config value.
inplace – If true, the operation is performed in place. Default: --model.activation.inplace config value.
negative_slope – Negative slope parameter for leaky_relu. Default: --model.activation.neg-slope config value.
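A hypothetical usage sketch; the dotted attribute names set on the Namespace below mirror the model.activation.* config entries listed above and are assumptions about how the opts object is laid out in practice:

```python
from argparse import Namespace
from cvnets.layers.activation import build_activation_layer

# Assumed config layout: dotted keys on an argparse Namespace, normally
# produced by the cvnets option parser.
opts = Namespace()
setattr(opts, "model.activation.name", "relu")
setattr(opts, "model.activation.inplace", False)
setattr(opts, "model.activation.neg_slope", 0.1)

default_act = build_activation_layer(opts)                  # falls back to the config: ReLU
swish_act = build_activation_layer(opts, act_type="swish")  # explicit argument overrides the config
```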