HardTanh Activation Function

From GM-RKB

A HardTanh Activation Function is a Hyperbolic Tangent-based Activation Function that is defined by the piecewise function:

[math]\displaystyle{ f(x) = \begin{cases} +1, & \mbox{ if } x \gt 1 \\ -1, & \mbox{ if } x \lt -1\\ x, & \mbox{ otherwise} \end{cases} }[/math]
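The piecewise definition above can be sketched as a direct translation into Python (a minimal illustration; the function name `hardtanh` is chosen here for clarity):

```python
def hardtanh(x):
    # Direct translation of the piecewise definition:
    # +1 above the linear region, -1 below it, identity inside it.
    if x > 1:
        return 1.0
    elif x < -1:
        return -1.0
    else:
        return x

print(hardtanh(3.0))   # saturates at the upper bound: 1.0
print(hardtanh(-3.0))  # saturates at the lower bound: -1.0
print(hardtanh(0.5))   # inside the linear region: 0.5
```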



References

2018

f(x) = +1, if x > 1
f(x) = -1, if x < -1
f(x) =  x, otherwise
The range of the linear region [−1,1] can be adjusted.
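Since HardTanh is simply a clamp to an interval, the adjustable range can be sketched by parameterizing the bounds (a minimal sketch; the `min_val`/`max_val` parameter names follow the PyTorch convention mentioned below):

```python
def hardtanh(x, min_val=-1.0, max_val=1.0):
    # Clamp x to [min_val, max_val]; identity inside the interval.
    return max(min_val, min(max_val, x))

# Default linear region [-1, 1]:
print(hardtanh(2.5))              # -> 1.0
# Widened linear region [-2, 2]:
print(hardtanh(1.5, -2.0, 2.0))   # -> 1.5 (still inside the region)
print(hardtanh(-3.0, -2.0, 2.0))  # -> -2.0 (clamped to the lower bound)
```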

Parameters:

  • min_val – minimum value of the linear region range. Default: -1
  • max_val – maximum value of the linear region range. Default: 1
  • inplace – can optionally do the operation in-place. Default: False
Keyword arguments min_value and max_value have been deprecated in favor of min_val and max_val.

Shape:

  • Input: [math]\displaystyle{ (N,∗) }[/math] where ∗ means any number of additional dimensions

  • Output: [math]\displaystyle{ (N,∗) }[/math], same shape as the input.
Examples:
>>> m = nn.Hardtanh(-2, 2)
>>> input = torch.randn(2)
>>> print(input)
>>> print(m(input))