Hard-Sigmoid Activation Function


A Hard-Sigmoid Activation Function is a Sigmoid-based Activation Function defined by the piecewise linear function:

[math]\displaystyle{ f(x) = \begin{cases} 0, & \mbox{ if } x \lt -2.5 \\ 0.2x+0.5, & \mbox{ if } -2.5 \le x \le 2.5\\ 1, & \mbox{ if } x \gt 2.5 \end{cases} }[/math]
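
The same mapping can be expressed with a single clip operation; the following is a minimal NumPy sketch of the formula above (the hard_sigmoid name here is just an illustrative label, not a particular library's API):

import numpy as np

def hard_sigmoid(x):
    # Piecewise-linear approximation of the logistic sigmoid:
    # 0 for x < -2.5, 0.2*x + 0.5 in between, 1 for x > 2.5.
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

print(hard_sigmoid(np.array([-2.6, -1.0, 0.0, 1.0, 2.6])))
# -> [0.  0.3 0.5 0.7 1. ]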



References

2018

It maps the input values into the range of [0, 1].
Returns: Output variable. A [math]\displaystyle{ (s_1,s_2,\cdots,s_N) }[/math]-shaped float array.
Return type: Variable
Example:
>>> import numpy as np
>>> import chainer.functions as F
>>> x = np.array([-2.6, -1, 0, 1, 2.6])
>>> x
array([-2.6, -1. ,  0. ,  1. ,  2.6])
>>> F.hard_sigmoid(x).data
array([0. , 0.3, 0.5, 0.7, 1. ])