Softshrink Activation Function


A Softshrink Activation Function is a neuron activation function that is based on the piecewise linear function [math]\displaystyle{ f(x) = \begin{cases} x-\lambda & \mbox{ if } x \gt \lambda \\ x+\lambda &\mbox{ if } x \lt -\lambda \\ 0 &\mbox{ otherwise } \end{cases} }[/math].
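Concretely, the rule shrinks every input toward zero by [math]\displaystyle{ \lambda }[/math] and zeroes anything inside the band [math]\displaystyle{ [-\lambda, \lambda] }[/math]. The following is a minimal sketch of that piecewise rule in PyTorch; the helper name softshrink and the default threshold of 0.5 (PyTorch's documented default, see the reference below) are used for illustration only.

import torch

def softshrink(x: torch.Tensor, lambd: float = 0.5) -> torch.Tensor:
    # x - lambd above the threshold, x + lambd below the negative threshold, 0 in between
    return torch.where(x > lambd, x - lambd,
                       torch.where(x < -lambd, x + lambd, torch.zeros_like(x)))

# inputs inside [-0.5, 0.5] map to 0; the rest move 0.5 toward zero
print(softshrink(torch.tensor([-2.0, -0.3, 0.0, 0.3, 2.0])))  # expected: [-1.5, 0.0, 0.0, 0.0, 1.5]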



References

2018a

f(x) = x-lambda, if x > lambda 
f(x) = x+lambda, if x < -lambda
f(x) = 0, otherwise
Parameters: lambd – the lambda value for the Softshrink formulation. Default: 0.5

Shape:

  • Input: [math]\displaystyle{ (N,∗) }[/math] where * means any number of additional dimensions
  • Output: [math]\displaystyle{ (N,∗) }[/math], same shape as the input
Examples:
>>> import torch
>>> import torch.nn as nn
>>> m = nn.Softshrink()
>>> input = torch.randn(2)
>>> print(input)
>>> print(m(input))
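As a sketch of how the lambd parameter above changes the threshold, a smaller lambda zeroes a narrower band around the origin; the expected outputs in the comments follow directly from the piecewise formula.

>>> x = torch.tensor([-1.0, -0.2, 0.0, 0.2, 1.0])
>>> nn.Softshrink(lambd=0.5)(x)  # expected: [-0.5, 0.0, 0.0, 0.0, 0.5]
>>> nn.Softshrink(lambd=0.1)(x)  # expected: [-0.9, -0.1, 0.0, 0.1, 0.9]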
forward(input)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note:

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance rather than forward directly, since the former takes care of running the registered hooks while the latter silently ignores them.
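A minimal sketch of that distinction, assuming a forward hook registered on an nn.Softshrink module: calling the module instance runs the hook, while calling forward() directly skips it.

import torch
import torch.nn as nn

m = nn.Softshrink()

# a forward hook that fires whenever the module instance is called
def announce(module, inputs, output):
    print("forward hook ran")

m.register_forward_hook(announce)

x = torch.randn(2)
y = m(x)          # prints "forward hook ran"
y = m.forward(x)  # the hook is silently skipped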