Clipped Rectifier Unit Activation Function


A Clipped Rectifier Unit Activation Function is a Rectified-based Activation Function that is thresholded at a clipping value [math]\displaystyle{ z }[/math], i.e. [math]\displaystyle{ f(x,z)=\min(\max(0,x),z) }[/math].
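This definition can be sketched directly in NumPy; the function name clipped_relu and the default clipping value z=20.0 below are illustrative assumptions, not tied to any particular library:

import numpy as np

def clipped_relu(x, z=20.0):
    # Element-wise clipped rectifier: negative inputs are mapped to 0,
    # and inputs larger than the clipping value z are mapped to z.
    return np.minimum(np.maximum(x, 0.0), z)

# Example: clipped_relu(np.array([-3.0, 0.5, 42.0]), z=10.0) -> [0.0, 0.5, 10.0]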



References

2018

Returns: Output variable. A [math]\displaystyle{ (s_1,s_2,\ldots,s_N) }[/math]-shaped float array.
Return type: Variable
Example:
>>> import numpy as np
>>> import chainer.functions as F
>>> # Random input containing values both below 0 and above the clipping value z.
>>> x = np.random.uniform(-100, 100, (10, 20)).astype('f')
>>> z = 10.0
>>> np.any(x < 0)
True
>>> np.any(x > z)
True
>>> # After clipping, all outputs lie in the interval [0, z].
>>> y = F.clipped_relu(x, z=z)
>>> np.any(y.data < 0)
False
>>> np.any(y.data > z)
False