Rectified Linear Unit (ReLU) Activation Function


A Rectified Linear Unit (ReLU) Activation Function is a neuron activation function that passes non-negative inputs through unchanged and maps negative inputs to zero, i.e. [math]\displaystyle{ ReLU(x) = \max(0, x) }[/math].
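A minimal sketch of this element-wise rule in plain Python with NumPy (the function name relu below is illustrative and not tied to any particular library):

import numpy as np

def relu(x):
    # Rectified linear unit applied element-wise:
    # negative entries become 0, non-negative entries pass through unchanged.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))   # [0.  0.  0.  1.5 3. ]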



References

2018a

  • (PyTorch, 2018) ⇒ http://pytorch.org/docs/master/nn.html#relu
    • QUOTE: class torch.nn.ReLU(inplace=False)

      Applies the rectified linear unit function element-wise: [math]\displaystyle{ \text{ReLU}(x) = \max(0, x) }[/math]

      Parameters: inplace – can optionally do the operation in-place. Default: False

      Shape:

      • Input: [math]\displaystyle{ (N, *) }[/math] where * means any number of additional dimensions
      • Output: [math]\displaystyle{ (N, *) }[/math], same shape as the input

      Examples:
>>> import torch
>>> from torch import nn, autograd
>>> m = nn.ReLU()
>>> input = autograd.Variable(torch.randn(2))
>>> print(input)
>>> print(m(input))
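The quoted example uses the older autograd.Variable wrapper; in more recent PyTorch releases plain tensors can be passed to the module directly. A minimal sketch of the same usage (assuming only the torch package is installed):

import torch
from torch import nn

m = nn.ReLU()            # module form of the activation
x = torch.randn(2)       # random input tensor
print(x)
print(m(x))              # negative entries are clamped to zero
print(torch.relu(x))     # functional form, same result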
