S-shaped Rectified Linear Activation Function

From GM-RKB

An S-shaped Rectified Linear Activation Function is a rectified-based activation function that is based on an S-shaped function.



References

2017

  • (Mate Labs, 2017) ⇒ Mate Labs (Aug 23, 2017). "Secret Sauce behind the Beauty of Deep Learning: Beginner's Guide to Activation Functions."
    • QUOTE:  S-shaped Rectified Linear Activation Unit (SReLU)

      Range: [math]\displaystyle{ (-\infty,+\infty) }[/math]

      [math]\displaystyle{ f_{t_l,a_l,t_r,a_r}(x) = \begin{cases} t_l+a_l(x-t_l) & \mbox{for } x \le t_l \\ x & \mbox{for } t_l\lt x \lt t_r\\ t_r+a_r(x-t_r) & \mbox{for } x \ge t_r\end{cases} }[/math]

      where [math]\displaystyle{ t_l,\; a_l, \;t_r,\; a_r }[/math] are learnable parameters.
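The three-segment piecewise-linear formula above can be sketched in NumPy as follows; the default threshold and slope values are illustrative only (in practice the four parameters are learned during training), not values prescribed by the source:

```python
import numpy as np

def srelu(x, t_l=-0.4, a_l=0.2, t_r=0.4, a_r=0.2):
    """S-shaped Rectified Linear Unit (SReLU).

    Implements the piecewise definition above:
      t_l + a_l * (x - t_l)  for x <= t_l
      x                      for t_l < x < t_r
      t_r + a_r * (x - t_r)  for x >= t_r
    Default parameter values here are hypothetical placeholders.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x <= t_l, t_l + a_l * (x - t_l),
           np.where(x >= t_r, t_r + a_r * (x - t_r), x))
```

For example, with the defaults above, inputs inside [math]\displaystyle{ (t_l, t_r) }[/math] pass through unchanged (`srelu(0.0)` returns `0.0`), while inputs outside that interval are rescaled by the corresponding slope (`srelu(1.4)` returns `0.4 + 0.2 * 1.0 = 0.6`).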

2016