# Softshrink Activation Function

A Softshrink Activation Function is a neuron activation function that is based on the piecewise linear function [math] f(x) = \begin{cases} x-\lambda & \mbox{ if } x \gt \lambda \\ x+\lambda & \mbox{ if } x \lt -\lambda \\ 0 & \mbox{ otherwise } \end{cases} [/math].
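A minimal NumPy sketch of this piecewise definition (the helper name `softshrink` and the default λ = 0.5 mirror the PyTorch reference below, but the function itself is illustrative):

```python
import numpy as np

def softshrink(x, lambd=0.5):
    """Illustrative softshrink: shrink values toward zero by lambd,
    zeroing everything in the band [-lambd, lambd]."""
    x = np.asarray(x, dtype=float)
    return np.where(x > lambd, x - lambd,
                    np.where(x < -lambd, x + lambd, 0.0))

# Values inside [-0.5, 0.5] map to 0; others move toward 0 by 0.5.
print(softshrink([-1.2, -0.3, 0.0, 0.4, 2.0]))  # [-0.7  0.   0.   0.   1.5]
```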

**Context:**
- It can (typically) be used in the activation of Softshrink Neurons.

**Example(s):**
- `torch.nn.Softshrink`,
- ...

**Counter-Example(s):**
- a Softmax-based Activation Function,
- a Rectified-based Activation Function,
- a Heaviside Step Activation Function,
- a Ramp Function-based Activation Function,
- a Logistic Sigmoid-based Activation Function,
- a Hyperbolic Tangent-based Activation Function,
- a Gaussian-based Activation Function,
- a Softsign Activation Function,
- an Adaptive Piecewise Linear Activation Function,
- a Bent Identity Activation Function,
- a Maxout Activation Function.

**See:** Artificial Neural Network, Artificial Neuron, Neural Network Topology, Neural Network Layer, Neural Network Learning Rate.

## References

### 2018a

- (PyTorch, 2018) ⇒ http://pytorch.org/docs/master/nn.html#softshrink
- QUOTE: `class torch.nn.Softshrink`

Applies the soft shrinkage function elementwise.

The SoftShrinkage operator is defined as:

[math] f(x) = \begin{cases} x-\lambda & \mbox{ if } x \gt \lambda \\ x+\lambda & \mbox{ if } x \lt -\lambda \\ 0 & \mbox{ otherwise } \end{cases} [/math]

**Parameters:**
- **lambd** – the lambda value for the Softshrink formulation. Default: 0.5

Shape:

- Input: [math](N, *)[/math] where * means any number of additional dimensions
- Output: [math](N, *)[/math], same shape as the input

- Examples:

>>> m = nn.Softshrink()
>>> input = autograd.Variable(torch.randn(2))
>>> print(input)
>>> print(m(input))
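As a short sketch of the `lambd` parameter (assuming a recent PyTorch, where plain tensors replace `autograd.Variable`; this example is not from the quoted docs), a non-default threshold can be set at construction:

```python
import torch
import torch.nn as nn

# Sketch: widen or narrow the "dead zone" via lambd.
m = nn.Softshrink(lambd=0.3)               # zeroes inputs with |x| <= 0.3
x = torch.tensor([-1.0, -0.2, 0.1, 0.8])
print(m(x))                                 # tensor([-0.7000, 0.0000, 0.0000, 0.5000])
```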

`forward(input)`

Defines the computation performed at every call.

Should be overridden by all subclasses.

**Note:** Although the recipe for the forward pass needs to be defined within this function, one should call the `Module` instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
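A brief illustrative sketch of that note (the hook function below is hypothetical, not from the quoted docs): calling the module instance dispatches registered hooks, while calling `forward()` directly skips them.

```python
import torch
import torch.nn as nn

m = nn.Softshrink()

# Hypothetical hook for illustration: prints whenever a forward pass completes.
def log_hook(module, inputs, output):
    print("forward hook fired, output:", output)

m.register_forward_hook(log_hook)

x = torch.randn(2)
m(x)           # hook runs: Module.__call__ dispatches registered hooks
m.forward(x)   # hook is silently skipped: forward() bypasses the hook machinery
```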