Hard-Sigmoid Activation Function
A Hard-Sigmoid Activation Function is a Sigmoid-based Activation Function that is defined by the piecewise linear function:
[math]\displaystyle{ f(x) = \begin{cases} 0, & \mbox{ if } x \le -2.5 \\ 0.2x+0.5, & \mbox{ if } -2.5 \lt x \lt 2.5 \\ 1, & \mbox{ if } x \ge 2.5 \end{cases} }[/math]
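For illustration, the following is a minimal NumPy sketch of this piecewise function (the helper name hard_sigmoid and the use of np.clip are illustrative assumptions, not taken from a specific library):

import numpy as np

def hard_sigmoid(x):
    # Clip the linear segment 0.2*x + 0.5 to [0, 1]; this reproduces
    # all three cases of the piecewise definition above.
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

print(hard_sigmoid(np.array([-3.0, -1.0, 0.0, 1.0, 3.0])))
# [0.  0.3 0.5 0.7 1. ]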
- Context:
- It can (typically) be used in the activation of Hard-Sigmoid Neurons.
- Example(s):
- chainer.functions.hard_sigmoid (see the Chainer reference below).
- Counter-Example(s):
- a Log-Sigmoid Activation Function,
- a Rectified-based Activation Function,
- a Heaviside Step Activation Function,
- a Ramp Function-based Activation Function,
- a Softmax-based Activation Function,
- a Hyperbolic Tangent-based Activation Function,
- a Gaussian-based Activation Function,
- a Softmin Activation Function,
- a Softsign Activation Function,
- a Softshrink Activation Function,
- an Adaptive Piecewise Linear Activation Function,
- a Bent Identity Activation Function,
- a Maxout Activation Function.
- See: Sigmoid Function, Artificial Neural Network, Artificial Neuron, Neural Network Topology, Neural Network Layer, Neural Network Learning Rate.
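Because the function is piecewise linear, its derivative is simply the constant slope 0.2 inside the linear region and 0 outside, which keeps backpropagation through Hard-Sigmoid Neurons cheap. A minimal sketch under the same assumptions as above (the helper name hard_sigmoid_grad is illustrative):

import numpy as np

def hard_sigmoid_grad(x):
    # Slope of the piecewise definition: 0.2 on (-2.5, 2.5), 0 elsewhere.
    return np.where(np.abs(x) < 2.5, 0.2, 0.0)

print(hard_sigmoid_grad(np.array([-3.0, 0.0, 3.0])))
# [0.  0.2 0. ]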
References
2018
- (Chainer, 2018) ⇒ http://docs.chainer.org/en/stable/reference/generated/chainer.functions.hard_sigmoid.html Retrieved:2018-2-18
- QUOTE:
chainer.functions.hard_sigmoid(x)
Element-wise hard-sigmoid function.
This function is defined as
[math]\displaystyle{ f(x) = \begin{cases} 0, & \mbox{ if } x \lt -2.5 \\ 0.2x+0.5, & \mbox{ if } -2.5 \lt x \lt 2.5 \\ 1, & \mbox{ if } 2.5 \lt x \end{cases} }[/math]
Parameters:
- x (Variable or numpy.ndarray or cupy.ndarray) – Input variable. A [math]\displaystyle{ (s_1,s_2,\cdots,s_N) }[/math]-shaped float array.
- Returns: Output variable. A [math]\displaystyle{ (s_1,s_2,\cdots,s_N) }[/math]-shaped float array.
- Return type: Variable
- Example:
- It maps the input values into the range [0, 1].
>>> x = np.array([-2.6, -1, 0, 1, 2.6])
>>> x
array([-2.6, -1. ,  0. ,  1. ,  2.6])
>>> F.hard_sigmoid(x).data
array([0. , 0.3, 0.5, 0.7, 1. ])